998 results for Sequential selection


Relevance: 30.00%

Publisher:

Abstract:

Optimization of photo-Fenton degradation of copper phthalocyanine blue was achieved by response surface methodology (RSM) constructed with the aid of a sequential injection analysis (SIA) system coupled to a homemade photo-reactor. The highest degradation percentage was obtained under the following conditions: [H2O2]/[phthalocyanine] = 7, [H2O2]/[FeSO4] = 10, pH = 2.5, and stopped-flow time in the photo-reactor = 30 s. The SIA system was designed to prepare a monosegment containing the reagents and sample, pump it toward the photo-reactor for the specified time, and send the products to a flow-through spectrophotometer for monitoring the color reduction of the dye. Changes in parameters such as reagent molar ratios, residence time, and pH were made through the software commanding the SIA system, without the need for physical reconfiguration of reagents around the selection valve. The proposed procedure and system fed the statistical program with degradation data for fast construction of response surface plots. After optimization, 97% of the dye was degraded.
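
As a rough illustration (not the authors' code), the second-order response surface used in RSM can be fitted to degradation data by ordinary least squares; the factor levels and degradation values below are hypothetical.

```python
# Hypothetical sketch: fit a second-order (quadratic) response surface to
# degradation data, as typically done in RSM. All data values are invented.
import numpy as np

# Factors: x1 = [H2O2]/[phthalocyanine], x2 = [H2O2]/[FeSO4]
x1 = np.array([5.0, 5.0, 7.0, 7.0, 9.0, 9.0, 7.0])
x2 = np.array([8.0, 12.0, 8.0, 12.0, 8.0, 12.0, 10.0])
y  = np.array([80.0, 85.0, 90.0, 93.0, 88.0, 91.0, 97.0])  # % degradation (hypothetical)

# Design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(v1, v2):
    return np.dot([1.0, v1, v2, v1**2, v2**2, v1 * v2], coef)

print("Predicted degradation at [H2O2]/[dye] = 7, [H2O2]/[FeSO4] = 10:",
      round(float(predict(7.0, 10.0)), 1), "%")
```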

Relevance: 30.00%

Publisher:

Abstract:

This paper studies cost-sharing rules under dynamic adverse selection. We present a typical principal-agent model with two periods, set up in Laffont and Tirole's (1986) canonical regulation environment. In the first period, when the contract is signed, the firm faces prior uncertainty about its efficiency parameter. In the second period, the firm learns its efficiency and chooses the level of cost-reducing effort. The optimal mechanism sequentially screens the firm's types and achieves a higher level of welfare than its static counterpart. The contract is indirectly implemented by a sequence of transfers, consisting of a fixed advance payment based on the reported cost estimate and an ex-post compensation linear in cost performance.
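
Schematically (this notation is ours, not the paper's), a transfer of that form can be written as t(ĉ, C) = a(ĉ) + b(ĉ)(ĉ − C), where ĉ is the reported cost estimate that determines the advance payment a(ĉ), C is realized cost, and the slope b(ĉ) fixes the share of any cost over- or under-run borne by the firm.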

Relevance: 30.00%

Publisher:

Abstract:

Species diversity itself may cause additional species diversity. According to recent findings, some species modify their environment in such a way that they facilitate the creation of new niches for other species to evolve to fill. Given the vast species diversity of insects, the occurrence of such sequential radiation of species is likely common among herbivorous insects and the species that depend on them, many of them being insects as well. Herbivorous insects often have close associations with specific host plants, and their preferences for mating and ovipositing on a specific host-plant species can reproductively isolate host-specific populations, facilitating speciation. Previous research by our laboratory has established that there are two distinct populations of the gall fly Eurosta solidaginis (Tephritidae), which attack different species of goldenrods, Solidago altissima (Asteraceae) and S. gigantea. The gall fly's host-associated differentiation is facilitating the divergence and potential speciation of two subpopulations of the gall-boring beetle Mordellistena convicta (Mordellidae) by providing new resources (galls on stems of the goldenrods) for the gall-boring beetles. These beetles exist as two host-plant-associated populations of inquilines that inhabit the galls induced by the gall fly. While our previous research has provided genetic and behavioral evidence for host-race formation, little is known about the role of their host plants in assortative mating and oviposition-site selection of the gall-boring beetles' host-associated populations. Volatile emissions from host plants can play a major role in assisting herbivores to locate their natal host plants and thus facilitate assortative mating and host-specific oviposition. The present study investigated the role of host-plant volatiles in host fidelity (mating on the host plant) and oviposition preference of M. convicta by measuring its behavioral responses to the host-plant volatile emissions using Y-tube olfactometers. In total, we tested behavioral responses of 615 beetles. Our results show that M. convicta adults are attracted to their natal host galls (67% of S. altissima-emerging beetles and 70% of S. gigantea-emerging beetles) and avoid the alternate host galls (75% of S. altissima-emerging beetles and 66% of S. gigantea-emerging beetles), while showing no preference for, or avoidance of, ungalled plants from either species. This suggests that the gall beetles can orient to the volatile chemicals emitted by the galls and can potentially use them to identify suitable sites for mating and/or oviposition. Thus, host-associated mating and oviposition may play a role in the sequential speciation of the gall-boring beetle.
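
For orientation only (the counts below are hypothetical, not the study's data), a two-sided binomial test is one standard way to check whether the observed attraction percentages in a two-arm Y-tube assay differ from the 50% expected by chance.

```python
# Hypothetical sketch: test whether beetles' choices in a two-arm olfactometer
# deviate from the 50:50 split expected by chance. Counts are invented.
from scipy.stats import binomtest

n_tested = 100          # hypothetical number of responding beetles in one treatment
n_to_natal_gall = 67    # hypothetical number choosing the natal-host gall arm

result = binomtest(n_to_natal_gall, n_tested, p=0.5, alternative="two-sided")
print(f"Proportion choosing natal gall: {n_to_natal_gall / n_tested:.2f}")
print(f"Two-sided binomial test p-value: {result.pvalue:.4f}")
```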

Relevance: 30.00%

Publisher:

Abstract:

When conducting a randomized comparative clinical trial, ethical, scientific, or economic considerations often motivate the use of interim decision rules after successive groups of patients have been treated. These decisions may pertain to the comparative efficacy or safety of the treatments under study, cost considerations, the desire to accelerate the drug evaluation process, or the likelihood of therapeutic benefit for future patients. At the time of each interim decision, an important question is whether patient enrollment should continue or be terminated, either because of a high probability that one treatment is superior to the other or a low probability that the experimental treatment will ultimately prove to be superior. The use of frequentist group sequential decision rules has become routine in the conduct of phase III clinical trials. In this dissertation, we present a new Bayesian decision-theoretic approach to the problem of designing a randomized group sequential clinical trial, focusing on two-arm trials with time-to-failure outcomes. Forward simulation is used to obtain optimal decision boundaries for each of a set of possible models. At each interim analysis, we use Bayesian model selection to adaptively choose the model having the largest posterior probability of being correct, and we then make the interim decision based on the boundaries that are optimal under the chosen model. We provide a simulation study to compare this method, which we call Bayesian Doubly Optimal Group Sequential (BDOGS), to corresponding frequentist designs using either O'Brien-Fleming (OF) or Pocock boundaries, as obtained from EaSt 2000. Our simulation results show that, over a wide variety of different cases, BDOGS either performs at least as well as both OF and Pocock or on average provides a much smaller trial.
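
The sketch below is not BDOGS itself, only a stripped-down illustration of the kind of interim rule involved: simulate exponential failure times for two arms, compute at each interim look the posterior probability that the experimental arm has the lower hazard (here under assumed conjugate Gamma priors), and stop early for superiority or futility at fixed thresholds. All numbers are illustrative.

```python
# Illustrative sketch of a Bayesian interim stopping rule for a two-arm trial
# with exponential (time-to-failure) outcomes and conjugate Gamma priors on
# the hazard rates. This is NOT the BDOGS procedure, just the general idea.
import numpy as np

rng = np.random.default_rng(0)
true_hazard = {"control": 0.10, "experimental": 0.07}   # events per month (illustrative)
group_size, n_looks = 40, 4
upper, lower = 0.975, 0.05                              # superiority / futility cutoffs

events = {"control": 0, "experimental": 0}
exposure = {"control": 0.0, "experimental": 0.0}

for look in range(1, n_looks + 1):
    for arm in events:
        t = rng.exponential(1.0 / true_hazard[arm], size=group_size)
        events[arm] += group_size                       # all failures observed (no censoring)
        exposure[arm] += t.sum()

    # Posterior for each hazard: Gamma(a0 + events, b0 + exposure), with a0 = b0 = 0.01
    post = {arm: rng.gamma(0.01 + events[arm], 1.0 / (0.01 + exposure[arm]), size=20000)
            for arm in events}
    p_exp_better = float(np.mean(post["experimental"] < post["control"]))
    print(f"Look {look}: P(experimental hazard < control hazard) = {p_exp_better:.3f}")

    if p_exp_better > upper:
        print("Stop: experimental arm superior."); break
    if p_exp_better < lower:
        print("Stop for futility."); break
else:
    print("Trial ran to completion without early stopping.")
```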

Relevance: 30.00%

Publisher:

Abstract:

The objective of this study was to investigate the effects of circularity, comorbidity, prevalence and presentation variation on the accuracy of differential diagnoses made in optometric primary care using a modified form of naïve Bayesian sequential analysis. No such investigation has ever been reported before. Data were collected for 1422 cases seen over one year. Positive test outcomes were recorded for case history (ethnicity, age, symptoms and ocular and medical history) and clinical signs in relation to each diagnosis. For this reason, only positive likelihood ratios were used for this modified form of Bayesian analysis, which was carried out with Laplacian correction and Chi-square filtration. Accuracy was expressed as the percentage of cases for which the diagnoses made by the clinician appeared at the top of a list generated by Bayesian analysis. Preliminary analyses were carried out on 10 diagnoses and 15 test outcomes. Accuracy of 100% was achieved in the absence of presentation variation but dropped by 6% when variation existed. Circularity artificially elevated accuracy by 0.5%. Surprisingly, removal of Chi-square filtering increased accuracy by 0.4%. Decision tree analysis showed that accuracy was influenced primarily by prevalence, followed by presentation variation and comorbidity. Analysis of 35 diagnoses and 105 test outcomes followed. This explored the use of positive likelihood ratios, derived from the case history, to recommend signs to look for. Accuracy of 72% was achieved when all clinical signs were entered. The drop in accuracy, compared to the preliminary analysis, was attributed to the fact that some diagnoses lacked strong diagnostic signs; the accuracy increased by 1% when only recommended signs were entered. Chi-square filtering improved recommended test selection. Decision tree analysis showed that accuracy was again influenced primarily by prevalence, followed by comorbidity and presentation variation. Future work will explore the use of likelihood ratios based on positive and negative test findings prior to considering naïve Bayesian analysis as a form of artificial intelligence in optometric practice.
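
A minimal sketch of the kind of sequential updating described (not the study's implementation; the prevalences and likelihood ratios below are invented) multiplies each diagnosis's prior odds by the positive likelihood ratio of every positive finding.

```python
# Minimal sketch of naive Bayesian updating with positive likelihood ratios (LR+).
# Prevalences and LR+ values below are invented for illustration only.
prevalence = {"dry eye": 0.20, "allergic conjunctivitis": 0.10, "glaucoma suspect": 0.02}

# LR+ of each positive finding for each diagnosis (hypothetical values)
lr_positive = {
    "gritty sensation": {"dry eye": 4.0, "allergic conjunctivitis": 1.5, "glaucoma suspect": 1.0},
    "itching":          {"dry eye": 1.2, "allergic conjunctivitis": 6.0, "glaucoma suspect": 1.0},
    "raised IOP":       {"dry eye": 1.0, "allergic conjunctivitis": 1.0, "glaucoma suspect": 8.0},
}

def rank_diagnoses(positive_findings):
    """Multiply prior odds by LR+ for each positive finding (naive independence)."""
    posterior_odds = {}
    for dx, prev in prevalence.items():
        odds = prev / (1.0 - prev)
        for finding in positive_findings:
            odds *= lr_positive[finding][dx]
        posterior_odds[dx] = odds
    # Convert odds back to probabilities and sort, most probable first
    probs = {dx: o / (1.0 + o) for dx, o in posterior_odds.items()}
    return sorted(probs.items(), key=lambda kv: kv[1], reverse=True)

print(rank_diagnoses(["gritty sensation", "itching"]))
```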

Relevance: 30.00%

Publisher:

Abstract:

This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. Job shop is the typical operation model used in an MTO operation, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and are mostly addressed with heuristic solutions. The first phase of the study is focused on a formal definition of the problem. Mathematical programming techniques are applied to model this problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to solve the resulting mixed-integer linear programming model with small instances to validate the mathematical formulation. The computational results show that it is not practical to solve problems of industrial size using a commercial solver. The second phase of this study is focused on developing an effective solution approach to large-scale instances of this problem. The proposed solution approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules are identified for each of the three subproblems. Using computer simulation as the tool, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted most profit rule performs best. The shifting bottleneck and earliest operation finish time rules are both the best scheduling rules. For lot sizing, the proposed minimum cost increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high. The proposed minimum cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio. The proposed heuristic is applied to an industrial case to further evaluate its performance. The results show that it can improve total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective solution approach for solving it at industrial scale.
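
A deliberately simplified sketch (not the dissertation's model) of the order-selection piece as a mixed-integer program: choose which orders to accept so as to maximize profit subject to a single aggregate capacity limit. The solver call, order data, and single-resource assumption are all illustrative; lot sizing and scheduling are omitted.

```python
# Simplified sketch of the order-selection subproblem as a MILP: accept or
# reject each order to maximize profit under one aggregate capacity limit.
# This omits lot sizing and scheduling; all data are hypothetical.
import pulp

orders = {            # order: (profit, processing hours required)
    "A": (500, 12),
    "B": (300, 5),
    "C": (450, 10),
    "D": (200, 6),
}
capacity_hours = 20

model = pulp.LpProblem("order_selection", pulp.LpMaximize)
accept = {o: pulp.LpVariable(f"accept_{o}", cat="Binary") for o in orders}

model += pulp.lpSum(orders[o][0] * accept[o] for o in orders)                  # total profit
model += pulp.lpSum(orders[o][1] * accept[o] for o in orders) <= capacity_hours

model.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [o for o in orders if accept[o].value() == 1]
print("Accepted orders:", chosen, "| Profit:", pulp.value(model.objective))
```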

Relevance: 30.00%

Publisher:

Abstract:

County jurisdictions in America are increasingly exercising self-government in the provision of public community services through the context of second-order federalism. In states exercising this form of contemporary governance, county governments with "reformed" policy-making structures and professional management practices have begun to rival or surpass municipalities in the delivery of local services with regional implications, such as environmental protection (Benton 2002, 2003; Marando and Reeves, 1993). The voter referendum, a form of direct democracy, is an important component of county land preservation and environmental protection policies. The recent growth and success of land preservation voter referendums nationwide reflects an increase in citizen participation in government and citizens' desire to protect vacant land and its natural environment from threats of over-development, urbanization and sprawl, loss of open space and farmland, deterioration of ecosystems, and inadequate park and recreational amenities. The study's design employs a sequential, mixed method. First, a quantitative approach employs the Heckman two-step model, fitted with variables for the non-random sample of 227 voter referendum counties and all non-voter-referendum counties in the U.S. from 1988 to 2009. Second, the qualitative data collected from an in-depth investigation of three South Florida county case studies, with twelve public administrator interviews, are transformed for integration with the quantitative findings. The purpose of the qualitative method is to complement, explain, and enrich the statistical analysis of county demographic, socio-economic, terrain, regional, governance and government, political preference, environmentalism, and referendum-specific factors. The research finds that government factors are significant for the success of land preservation voter referendums; more specifically, the presence of self-government authority (home rule charter), a reformed structure (county administrator/manager or elected executive), and environmental interest groups. In addition, this study concludes that successful counties are often coastal, exhibit population and housing growth, and have older and more educated citizens who vote Democratic in presidential elections. The analysis of case study documents and public administrator interviews finds that pragmatic considerations of timing, local politics, and networking of regional stakeholders are also important features of success. Further research is suggested utilizing additional public participation, local government, and public administration factors.
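
A bare-bones sketch of the Heckman two-step estimator (a generic illustration with synthetic data, not the study's specification): a probit selection equation, followed by an outcome regression augmented with the inverse Mills ratio.

```python
# Generic Heckman two-step sketch with synthetic data (not the study's model):
# step 1: probit selection equation; step 2: OLS on the selected sample with
# the inverse Mills ratio added as a regressor to correct for selection.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 2000
z = rng.normal(size=n)                      # instrument affecting selection only
x = rng.normal(size=n)                      # covariate in the outcome equation
u = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n)

selected = (0.5 + 1.0 * z + u[:, 0] > 0)    # selection rule (e.g., county holds a referendum)
y = 1.0 + 2.0 * x + u[:, 1]                 # outcome, observed only when selected

# Step 1: probit of selection on z (and a constant)
Zmat = sm.add_constant(z)
probit = sm.Probit(selected.astype(int), Zmat).fit(disp=False)
xb = Zmat @ probit.params
mills = norm.pdf(xb) / norm.cdf(xb)         # inverse Mills ratio

# Step 2: OLS on the selected observations, including the Mills ratio
Xmat = sm.add_constant(np.column_stack([x[selected], mills[selected]]))
ols = sm.OLS(y[selected], Xmat).fit()
print(ols.params)                           # [intercept, x coefficient, Mills-ratio coefficient]
```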

Relevance: 30.00%

Publisher:

Abstract:

A decision-maker, when faced with a limited and fixed budget to collect data in support of a multiple attribute selection decision, must decide how many samples to observe from each alternative and attribute. This allocation decision is of particular importance when the information gained leads to uncertain estimates of the attribute values, as with sample data collected from observations such as measurements, experimental evaluations, or simulation runs. For example, when the U.S. Department of Homeland Security must decide upon a radiation detection system to acquire, a number of performance attributes are of interest and must be measured in order to characterize each of the considered systems. We identified and evaluated several approaches to incorporate the uncertainty in the attribute value estimates into a normative model for a multiple attribute selection decision. Assuming an additive multiple attribute value model, we demonstrated the idea of propagating the attribute value uncertainty and describing the decision values for each alternative as probability distributions. These distributions were used to select an alternative. With the goal of maximizing the probability of correct selection, we developed and evaluated, under several different sets of assumptions, procedures to allocate the fixed experimental budget across the multiple attributes and alternatives. Through a series of simulation studies, we compared the performance of these allocation procedures to the simple, but common, allocation procedure that distributed the sample budget equally across the alternatives and attributes. We found that the allocation procedures developed based on the inclusion of decision-maker knowledge, such as knowledge of the decision model, outperformed those that neglected such information. Beginning with general knowledge of the attribute values provided by Bayesian prior distributions, and updating this knowledge with each observed sample, the sequential allocation procedure performed particularly well. These observations demonstrate that managing projects focused on a selection decision so that the decision modeling and the experimental planning are done jointly, rather than in isolation, can improve the overall selection results.
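
The short sketch below (synthetic numbers, not the dissertation's procedures) illustrates the core idea of propagating attribute-value uncertainty through an additive value model and reading off, by Monte Carlo, how often each alternative comes out on top.

```python
# Sketch: propagate sampling uncertainty in attribute estimates through an
# additive multi-attribute value model and estimate, by Monte Carlo, how often
# each alternative is selected as best. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
weights = np.array([0.5, 0.3, 0.2])                 # attribute weights (sum to 1)

# Sample mean and standard error of each (0-1 scaled) attribute value for three
# alternatives; in practice these come from the observed samples.
means = np.array([[0.70, 0.60, 0.80],
                  [0.75, 0.55, 0.70],
                  [0.65, 0.65, 0.75]])
std_errs = np.array([[0.05, 0.08, 0.04],
                     [0.06, 0.07, 0.05],
                     [0.05, 0.06, 0.06]])

draws = rng.normal(means, std_errs, size=(100_000, *means.shape))
overall_value = draws @ weights                     # additive value model, per draw
winner = overall_value.argmax(axis=1)
p_best = np.bincount(winner, minlength=len(means)) / len(winner)
print("Estimated probability each alternative is selected as best:", p_best.round(3))
```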

Relevance: 30.00%

Publisher:

Abstract:

Natural language processing has achieved great success in a wide range of applications, producing both commercial language services and open-source language tools. However, most methods take a static or batch approach, assuming that the model has all information it needs and makes a one-time prediction. In this dissertation, we study dynamic problems where the input comes in a sequence instead of all at once, and the output must be produced while the input is arriving. In these problems, predictions are often made based only on partial information. We see this dynamic setting in many real-time, interactive applications. These problems usually involve a trade-off between the amount of input received (cost) and the quality of the output prediction (accuracy). Therefore, the evaluation considers both objectives (e.g., plotting a Pareto curve). Our goal is to develop a formal understanding of sequential prediction and decision-making problems in natural language processing and to propose efficient solutions. Toward this end, we present meta-algorithms that take an existent batch model and produce a dynamic model to handle sequential inputs and outputs. We build our framework upon theories of Markov Decision Process (MDP), which allows learning to trade off competing objectives in a principled way. The main machine learning techniques we use are from imitation learning and reinforcement learning, and we advance current techniques to tackle problems arising in our settings. We evaluate our algorithm on a variety of applications, including dependency parsing, machine translation, and question answering. We show that our approach achieves a better cost-accuracy trade-off than the batch approach and heuristic-based decision-making approaches. We first propose a general framework for cost-sensitive prediction, where different parts of the input come at different costs. We formulate a decision-making process that selects pieces of the input sequentially, and the selection is adaptive to each instance. Our approach is evaluated on both standard classification tasks and a structured prediction task (dependency parsing). We show that it achieves similar prediction quality to methods that use all input, while inducing a much smaller cost. Next, we extend the framework to problems where the input is revealed incrementally in a fixed order. We study two applications: simultaneous machine translation and quiz bowl (incremental text classification). We discuss challenges in this setting and show that adding domain knowledge eases the decision-making problem. A central theme throughout the chapters is an MDP formulation of a challenging problem with sequential input/output and trade-off decisions, accompanied by a learning algorithm that solves the MDP.
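
As a toy illustration of the cost-accuracy trade-off described (not the dissertation's imitation/reinforcement-learning method), the sketch below reads the features of an instance one at a time and stops as soon as a running classifier's confidence clears a threshold, trading input consumed against prediction quality.

```python
# Toy sketch of sequential input consumption with a stopping rule: observe one
# feature at a time and predict once a confidence threshold is reached.
# This is a hand-rolled heuristic policy, not the dissertation's learned policy.
import numpy as np
from scipy.stats import norm

def sequential_predict(feature_stream, class_log_likelihoods, threshold=0.9):
    """feature_stream: iterable of feature values arriving in order.
    class_log_likelihoods: dict class -> function(feature_index, value) -> log p.
    Returns (predicted class, number of features consumed)."""
    classes = list(class_log_likelihoods)
    log_post = {c: 0.0 for c in classes}            # uniform prior (up to a constant)
    for i, value in enumerate(feature_stream, start=1):
        for c in classes:
            log_post[c] += class_log_likelihoods[c](i - 1, value)
        m = max(log_post.values())                  # normalize to probabilities
        probs = {c: np.exp(v - m) for c, v in log_post.items()}
        z = sum(probs.values())
        probs = {c: p / z for c, p in probs.items()}
        best = max(probs, key=probs.get)
        if probs[best] >= threshold:                # confident enough: stop early
            return best, i
    return best, i                                  # ran out of input

# Example: two Gaussian classes differing in mean; feature values are invented.
lls = {"A": lambda i, x: norm.logpdf(x, loc=1.0),
       "B": lambda i, x: norm.logpdf(x, loc=-1.0)}
print(sequential_predict([0.8, 1.2, 0.9, 1.1], lls, threshold=0.95))
```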

Relevance: 20.00%

Publisher:

Abstract:

Negative-ion mode electrospray ionization, ESI(-), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled with partial least squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. Generally, ESI(-)-FT-ICR mass spectra present a resolving power of ca. 500,000 and a mass accuracy of less than 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M - H](-) ions, which are identified primarily as naphthenic acids, phenols, and carbazole analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg of KOH g(-1). To facilitate the spectral interpretation, three methods of variable selection were studied: variable importance in the projection (VIP), interval partial least squares (iPLS), and elimination of uninformative variables (UVE). The UVE method seems to be the most appropriate for selecting important variables, reducing the number of variables to 183 and producing a root mean square error of prediction of 0.32 mg of KOH g(-1). By reducing the size of the data, it was possible to relate the selected variables to their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.
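
For orientation (synthetic data, not the paper's FT-ICR spectra), a PLS regression of TAN on a wide matrix of ion intensities can be sketched as follows; a real analysis would add the VIP/iPLS/UVE variable selection step discussed above.

```python
# Sketch: PLS regression of TAN on a wide matrix of ion intensities, reporting
# the root mean square error of prediction on a held-out set. Data are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_samples, n_vars = 60, 5700                     # ~5700 assigned species per sample
X = rng.normal(size=(n_samples, n_vars))
true_coef = np.zeros(n_vars)
true_coef[:20] = 0.3                             # only a few informative species
y = X @ true_coef + rng.normal(scale=0.2, size=n_samples)   # stand-in for TAN

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()
rmsep = np.sqrt(np.mean((y_te - y_hat) ** 2))
print(f"RMSEP on held-out samples: {rmsep:.2f} (same units as TAN)")
```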

Relevance: 20.00%

Publisher:

Abstract:

The genera Cochliomyia and Chrysomya contain both obligate and saprophagous flies, which allows the comparison of different feeding habits between closely related species. Among the different strategies for comparing these habits is the use of qPCR to investigate the expression levels of candidate genes involved in feeding behavior. To ensure an accurate measure of the levels of gene expression, it is necessary to normalize the amount of the target gene with the amount of a reference gene having a stable expression across the compared species. Since there is no universal gene that can be used as a reference in functional studies, candidate genes for qPCR data normalization were selected and validated in three Calliphoridae (Diptera) species: Cochliomyia hominivorax Coquerel, Cochliomyia macellaria Fabricius, and Chrysomya albiceps Wiedemann. The expression stability of six genes (Actin, Gapdh, Rp49, Rps17, α-tubulin, and GstD1) was evaluated among species within the same life stage and between life stages within each species. The expression levels of Actin, Gapdh, and Rp49 were the most stable among the selected genes. These genes can be used as reliable reference genes for functional studies in Calliphoridae using similar experimental settings.
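
A very simple way to screen candidate reference genes (illustrative only; it is not the authors' analysis, which would typically rely on dedicated tools such as geNorm or NormFinder) is to compare the spread of quantification-cycle (Cq) values across samples.

```python
# Sketch: rank candidate reference genes by the standard deviation of their
# Cq values across samples (smaller spread = more stable expression).
# Cq values below are invented for illustration.
import numpy as np

cq = {   # gene -> Cq values measured across samples/life stages (hypothetical)
    "Actin":     [18.1, 18.3, 18.0, 18.2, 18.4],
    "Gapdh":     [20.0, 20.2, 19.9, 20.1, 20.3],
    "Rp49":      [21.5, 21.4, 21.6, 21.5, 21.7],
    "Rps17":     [22.0, 23.1, 21.4, 22.8, 23.5],
    "a-tubulin": [19.0, 20.2, 18.5, 21.0, 19.8],
    "GstD1":     [25.0, 27.2, 24.1, 26.8, 25.9],
}

stability = {gene: float(np.std(values, ddof=1)) for gene, values in cq.items()}
for gene, sd in sorted(stability.items(), key=lambda kv: kv[1]):
    print(f"{gene:10s} SD(Cq) = {sd:.2f}")
```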

Relevance: 20.00%

Publisher:

Abstract:

We compared the indication of laparoscopy for treatment of adnexal masses based on risk scores and tumor diameters with the indication based on gynecology-oncologists' experience. This was a prospective study of 174 women who underwent surgery for adnexal tumors (116 laparotomies, 58 laparoscopies). Surgeries begun and completed by laparoscopy, with a benign pathologic diagnosis, were considered successful. Laparoscopic surgeries that required conversion to laparotomy, led to a malignant diagnosis, or resulted in cyst rupture were considered failures. Two groups were defined for laparoscopy indication: (1) absence of the American College of Obstetrics and Gynecology (ACOG) guideline criteria for referral of high-risk adnexal masses (ACOG negative), associated with 3 different tumor sizes (10, 12, and 14 cm); and (2) the Index of Risk of Malignancy (IRM), with cutoffs at 100, 200, and 300, associated with the same 3 tumor sizes. Both groups were compared with the indication based on the surgeon's experience to verify whether selection based on strict rules would improve the rate of successful laparoscopy. ACOG negative with tumors ≤10 cm, and IRM with a cutoff at 300 points with tumors ≤10 cm, resulted in the same best performance (78% success = 38/49 laparoscopies). However, these results did not differ significantly from those obtained based on the gynecology-oncologists' experience. The selection of patients with adnexal masses for laparoscopy by use of the ACOG guideline or the IRM associated with tumor diameter performed similarly to the experience of gynecology-oncologists. Both methods are reproducible and easy to apply to all women with adnexal masses and could be used by general gynecologists to select women for laparoscopic surgery; however, referral to a gynecology-oncologist is advisable when there is any doubt.
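
The two selection rules compared in the study can be expressed as simple screens; the functions below are only a schematic restatement of the cutoffs quoted in the abstract (ACOG criteria absent, or IRM below a chosen cutoff, each combined with tumor diameter ≤ 10 cm), not clinical guidance.

```python
# Schematic restatement of the two selection rules described in the abstract.
# Cutoff values follow the abstract's best-performing settings; whether the
# IRM cutoff is strict or inclusive is an assumption. Not clinical guidance.

def acog_rule(acog_positive: bool, diameter_cm: float,
              max_diameter_cm: float = 10.0) -> bool:
    """Rule 1: ACOG referral criteria absent and tumor diameter within the limit."""
    return (not acog_positive) and diameter_cm <= max_diameter_cm

def irm_rule(irm_score: float, diameter_cm: float,
             irm_cutoff: float = 300.0, max_diameter_cm: float = 10.0) -> bool:
    """Rule 2: IRM below the cutoff and tumor diameter within the limit."""
    return irm_score < irm_cutoff and diameter_cm <= max_diameter_cm

# Hypothetical examples:
print(acog_rule(acog_positive=False, diameter_cm=8))        # True
print(irm_rule(irm_score=450, diameter_cm=12))              # False
```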

Relevance: 20.00%

Publisher:

Abstract:

The n→π* absorption transition of formaldehyde in water is analyzed using combined and sequential classical Monte Carlo (MC) simulations and quantum mechanics (QM) calculations. MC simulations generate the liquid solute-solvent structures for subsequent QM calculations. Using time-dependent density functional theory with a localized set of Gaussian basis functions (TD-DFT/6-311++G(d,p)), calculations are made on statistically relevant configurations to obtain the average solvatochromic shift. All results presented here use the electrostatic embedding of the solvent. The statistically converged average result of 2300 cm-1 is compared to previous theoretical results available. An analysis is made of the effective dipole moment of the hydrogen-bonded shell and how it could be held responsible for the polarization of the solvent molecules in the outer solvation shells.
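
A minimal sketch of the averaging step in such a sequential MC/QM approach (the excitation energies below are invented, not the paper's TD-DFT results): average the n→π* excitation energy over the statistically uncorrelated MC configurations and subtract the isolated-molecule value to obtain the shift in cm-1.

```python
# Sketch of the final averaging step in a sequential MC + QM calculation:
# the solvatochromic shift is the mean excitation energy over the sampled
# solute-solvent configurations minus the isolated-molecule value.
# Excitation energies below are invented, not the paper's TD-DFT numbers.
import numpy as np

EV_TO_CM1 = 8065.54                         # 1 eV in wavenumbers (cm^-1)

gas_phase_ev = 3.90                         # hypothetical isolated-molecule n->pi* energy
solvated_ev = np.array([4.16, 4.21, 4.19, 4.23, 4.18, 4.20, 4.22, 4.17])  # per configuration

shift_cm1 = (solvated_ev.mean() - gas_phase_ev) * EV_TO_CM1
sem_cm1 = solvated_ev.std(ddof=1) / np.sqrt(len(solvated_ev)) * EV_TO_CM1
print(f"Average blue shift: {shift_cm1:.0f} +/- {sem_cm1:.0f} cm^-1")
```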

Relevance: 20.00%

Publisher:

Abstract:

This paper describes a sequential injection chromatography procedure for the determination of picloram in waters, exploiting the low backpressure of a 2.5 cm long monolithic C18 column. Separation of the analyte from the matrix was achieved in less than 60 s using a mobile phase composed of 20:80 (v/v) acetonitrile:5.0 mmol L-1 H3PO4 at a flow rate of 30 μL s-1. Detection was made at 223 nm with a 40 mm optical path length cell. The limits of detection and quantification were 33 and 137 μg L-1, respectively. The proposed method is sensitive enough to monitor the maximum concentration level for picloram in drinking water (500 μg L-1). The sampling frequency is 60 analyses per hour, consuming only 300 μL of acetonitrile per analysis. The proposed methodology was applied to spiked river water samples, and no statistically significant differences were observed in comparison to a conventional HPLC-UV method.
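
As a generic reminder of how such figures of merit are commonly obtained (the calibration data below are invented, and the 3.3σ/10σ convention is one common choice, not necessarily the authors'):

```python
# Generic sketch of estimating LOD and LOQ from a calibration curve using the
# common 3.3*sigma/slope and 10*sigma/slope convention. Data are invented and
# do not reproduce the paper's figures of merit.
import numpy as np

conc = np.array([100, 250, 500, 1000, 2000], dtype=float)   # ug/L (hypothetical)
peak_area = np.array([0.012, 0.031, 0.060, 0.118, 0.241])   # arbitrary units

slope, intercept = np.polyfit(conc, peak_area, 1)
residuals = peak_area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)               # residual standard deviation of the fit

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD ~ {lod:.0f} ug/L, LOQ ~ {loq:.0f} ug/L")
```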