860 results for two-stage sampling


Relevance:

80.00%

Publisher:

Abstract:

Centrifugal pumps are widely used in many industrial applications. Knowledge of how these components behave under different circumstances is crucial for the development of more efficient and, therefore, less expensive pumping installations. The combination of multiple impellers, vaned diffusers and a volute may introduce complex flow characteristics that deviate considerably from classical inviscid pump flow theory. Computational Fluid Dynamics can be very helpful for extracting information about which physical phenomena are involved in such flows. In this sense, this work presents a numerical study of the flow in a two-stage centrifugal pump (Imbil ITAP 65-330/2) with a vaned diffuser and a volute. The flow in the pump is modeled in Ansys CFX by means of a multi-block, transient rotor-stator technique, with structured grids for all pump parts. The simulations were performed using water and a mixture of water and glycerin as working fluids. Several viscosities were considered, in a range between 87 and 720 cP. Comparisons between experimental data obtained by Amaral (2007) and the numerical head curves showed good agreement, with an average deviation of 6.8% for water. The behavior of the velocity, pressure and turbulence kinetic energy fields was evaluated for several operating conditions. In general, the results obtained in this work achieved the proposed goals and constitute a significant contribution to the understanding of the flow studied.
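
As a minimal sketch (not the dissertation's actual post-processing), the reported average deviation between numerical and experimental head curves can be computed as a mean relative error; the flow points and head values below are hypothetical placeholders, not data from Amaral (2007) or the CFX simulations.

```python
# Sketch: average relative deviation between numerical and experimental head curves.
# The head values below are hypothetical placeholders.

head_experimental = [42.0, 40.5, 38.2, 34.9, 30.1]  # head [m] at matching flow rates
head_numerical    = [44.6, 42.8, 40.9, 37.3, 32.4]

deviations = [
    abs(h_num - h_exp) / h_exp * 100.0
    for h_num, h_exp in zip(head_numerical, head_experimental)
]
average_deviation = sum(deviations) / len(deviations)

print(f"Average deviation: {average_deviation:.1f}%")  # compared against the 6.8% reported for water
```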

Relevance:

80.00%

Publisher:

Abstract:

An indirect genetic algorithm for the non-unicost set covering problem is presented. The algorithm is a two-stage meta-heuristic, which in the past was successfully applied to similar multiple-choice optimisation problems. The two stages of the algorithm are an ‘indirect’ genetic algorithm and a decoder routine. First, the solutions to the problem are encoded as permutations of the rows to be covered, which are subsequently ordered by the genetic algorithm. Fitness assignment is handled by the decoder, which transforms the permutations into actual solutions to the set covering problem. This is done by exploiting both problem structure and problem-specific information. However, flexibility is retained by a self-adjusting element within the decoder, which allows adjustments both to the data and to the stage of the search process. Computational results are presented.
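
A minimal sketch of the decoder idea is given below: rows are visited in the order prescribed by the GA permutation, and each still-uncovered row is covered by the column with the best cost per newly covered row. This is a simplified greedy decoder written for illustration under those assumptions, not the paper's exact routine (which also uses problem-specific information and a self-adjusting element).

```python
# Sketch of a permutation decoder for the (non-unicost) set covering problem.
# Simplified greedy decoder, for illustration only.

def decode(permutation, columns, costs):
    """permutation: row order produced by the GA.
    columns: list of sets, columns[j] = rows covered by column j.
    costs: list of column costs."""
    covered, solution = set(), []
    for row in permutation:
        if row in covered:
            continue
        best_j, best_ratio = None, float("inf")
        for j, rows_j in enumerate(columns):
            if row in rows_j:
                ratio = costs[j] / len(rows_j - covered)  # cost per newly covered row
                if ratio < best_ratio:
                    best_j, best_ratio = j, ratio
        solution.append(best_j)
        covered |= columns[best_j]
    return solution, sum(costs[j] for j in set(solution))

# Tiny example: 4 rows, 3 columns.
cols = [{0, 1}, {1, 2, 3}, {0, 3}]
costs = [2.0, 3.0, 2.5]
print(decode([2, 0, 1, 3], cols, costs))  # ([1, 0], 5.0)
```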

Relevance:

80.00%

Publisher:

Abstract:

Internship report presented to the Escola Superior de Educação de Paula Frassinetti for the degree of Master in Pre-School Education and Teaching in the 1st Cycle of Basic Education.

Relevance:

80.00%

Publisher:

Abstract:

This dissertation focuses mainly on coordinated pricing and inventory management problems; the related background is provided in Chapter 1. Several periodic-review models are then discussed in Chapters 2, 3, 4 and 5, respectively. Chapter 2 analyzes a deterministic single-product model in which a price adjustment cost is incurred whenever the current selling price differs from that of the previous period. We develop exact algorithms for the problem under different conditions and find that the computational complexity varies significantly with the cost structure. Moreover, our numerical study indicates that dynamic pricing strategies may outperform static pricing strategies even when the price adjustment cost accounts for a significant portion of the total profit. Chapter 3 develops a single-product model in which the demand of a period depends not only on the current selling price but also on past prices through the so-called reference price. Strongly polynomial time algorithms are designed for the case with no fixed ordering cost, and a heuristic with an error bound estimation is proposed for the general case. Moreover, our numerical studies illustrate that incorporating the reference price effect into coordinated pricing and inventory models can have a significant impact on firms' profits. Chapter 4 discusses the stochastic version of the model in Chapter 3 when customers are loss averse. It extends the associated results developed in the literature and proves that a reference-price-dependent base-stock policy is optimal under certain conditions. Instead of dealing with specific problems, Chapter 5 establishes the preservation of supermodularity in a class of optimization problems. This property and its extensions include several existing results in the literature as special cases and provide powerful tools, as we illustrate with applications to several operations problems: the stochastic two-product model with cross-price effects, the two-stage inventory control model, and the self-financing model.
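
To make the reference price effect of Chapter 3 concrete, the sketch below uses a common textbook formulation (exponential-smoothing memory of past prices and a linear demand with a reference-gap term). The functional forms and all parameter values are illustrative assumptions; the dissertation's exact model specification may differ.

```python
# Sketch of a commonly used reference-price demand model (illustrative only):
# the reference price is an exponentially smoothed memory of past prices, and
# demand depends on the current price and on the gap between reference and price.

def update_reference_price(r_prev, p_prev, alpha=0.7):
    # r_t = alpha * r_{t-1} + (1 - alpha) * p_{t-1}
    return alpha * r_prev + (1.0 - alpha) * p_prev

def demand(p, r, a=100.0, b=2.0, gamma=1.5):
    # Linear base demand minus price sensitivity, plus a reference effect:
    # customers perceive a gain when p < r and a loss when p > r.
    return max(0.0, a - b * p + gamma * (r - p))

# Simulate a few periods of prices and the induced demand path.
prices = [20.0, 22.0, 18.0, 18.0, 21.0]
r = prices[0]
for t, p in enumerate(prices):
    d = demand(p, r)
    print(f"t={t}: price={p:.1f}, reference={r:.2f}, demand={d:.1f}")
    r = update_reference_price(r, p)
```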

Relevance:

80.00%

Publisher:

Abstract:

Background: A study of the effects of the particle size of lignocellulosic substrates and of ultrasound pretreatment on the efficiency of subsequent enzymatic hydrolysis and fermentation to ethanol. Results: The maximum concentrations of glucose and, to a lesser extent, of di- and trisaccharides were obtained in a series of experiments with 48-h enzymatic hydrolysis of pine raw materials ground at 380–400 rpm for 30 min. The highest glucose yield was observed at the end of the hydrolysis with a cellulase dosage of 10 mg of protein (204 ± 21 units CMCase per g of sawdust). The greatest enzymatic hydrolysis efficiency was observed in a sample that combined two-stage grinding at 400 rpm with ultrasonic treatment for 5–10 min at a power of 10 W per kg of sawdust. The glucose yield in this case (35.5 g glucose l−1) increased twofold compared to the ground substrate without further preparation. Conclusions: Mechanical two-stage grinding of lignocellulosic raw materials combined with ultrasonication increases the efficiency of subsequent enzymatic hydrolysis and fermentation.

Relevance:

80.00%

Publisher:

Abstract:

The size of online image datasets is constantly increasing. For an image dataset with millions of images, image retrieval becomes a seemingly intractable problem for exhaustive similarity search algorithms. Hashing methods, which encode high-dimensional descriptors into compact binary strings, have become very popular because of their high search efficiency and low storage cost. In the first part, we propose a multimodal retrieval method based on latent feature models. The procedure consists of a nonparametric Bayesian framework for learning underlying, semantically meaningful abstract features in a multimodal dataset, a probabilistic retrieval model that allows cross-modal queries, and an extension model for relevance feedback. In the second part, we focus on supervised hashing with kernels. We describe a flexible hashing procedure that treats binary codes and pairwise semantic similarity as latent and observed variables, respectively, in a probabilistic model based on Gaussian processes for binary classification. We present a scalable inference algorithm with the sparse pseudo-input Gaussian process (SPGP) model and distributed computing. In the last part, we define an incremental hashing strategy for dynamic databases to which new images are added frequently. The method is based on a two-stage classification framework using binary and multi-class SVMs, and it enforces balance in the binary codes through an imbalance penalty to obtain higher-quality codes. We learn hash functions with an efficient algorithm in which the NP-hard problem of finding optimal binary codes is solved via cyclic coordinate descent and the SVMs are trained in a parallelized, incremental manner. For modifications such as adding images from an unseen class, we propose an incremental procedure for effective and efficient updates to the previous hash functions. Experiments on three large-scale image datasets demonstrate that the incremental strategy efficiently updates hash functions to the same retrieval performance as hashing from scratch.
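
The sketch below illustrates the general idea of supervised hash codes built from SVM hyperplanes (one linear SVM per bit, codes given by the sign of the decision function). It is a deliberately simplified baseline on synthetic data, not the dissertation's two-stage binary/multi-class SVM framework, its imbalance penalty, or its cyclic coordinate descent solver; the data, bit count and class-bipartition scheme are all assumptions made for the example.

```python
# Sketch: supervised hash codes from SVM hyperplanes (simplified baseline).
# Each hash bit is the sign of a linear SVM trained on a random bipartition
# of the class labels, so semantically similar images tend to share bits.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical toy data: 200 descriptors in 32 dimensions, 8 classes.
X = rng.normal(size=(200, 32))
y = rng.integers(0, 8, size=200)
num_bits = 16

hash_svms = []
for _ in range(num_bits):
    # A random bipartition of the class set defines a binary training target.
    positive_classes = rng.choice(8, size=4, replace=False)
    targets = np.isin(y, positive_classes).astype(int)
    hash_svms.append(LinearSVC().fit(X, targets))

def encode(descriptors):
    """Map descriptors to {0,1}^num_bits using the learned hyperplanes."""
    bits = [svm.decision_function(descriptors) > 0 for svm in hash_svms]
    return np.stack(bits, axis=1).astype(np.uint8)

codes = encode(X)
print(codes.shape, codes[0])
```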

Relevance:

80.00%

Publisher:

Abstract:

Background: Improper handling has been identified as one of the major reasons for the decline in vaccine potency at the time of administration. Loss of potency becomes evident when immunised individuals contract the diseases the vaccines were meant to prevent. Objective: To assess the factors associated with vaccine handling and storage practices. Methods: This was a cross-sectional study. Three-stage sampling was used to recruit 380 vaccine handlers from 273 health facilities in 11 Local Government Areas in Ibadan. Data were analysed using SPSS version 16. Results: Seventy-three percent of respondents were aware of vaccine handling and storage guidelines, and 68.4% had ever read such guidelines. Only 15.3% had read a guideline less than 1 month prior to the study. About 65.0% had received training on vaccine management. Incorrect handling practices reported included storing injections with vaccines (13.7%) and maintaining vaccine temperature using ice blocks (7.6%). About 43.0% had good knowledge of vaccine management, while 66.1% had good vaccine management practices. Respondents who had good knowledge of vaccine handling and storage [OR=10.0, 95% CI (5.28–18.94), p < 0.001] and who had received formal training on vaccine management [OR=5.3, 95% CI (2.50–11.14), p < 0.001] were more likely to have good vaccine handling and storage practices. Conclusion: Regular training is recommended to enhance vaccine handling and storage practices.
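
For readers unfamiliar with the association measures reported above, the sketch below shows how an odds ratio and its 95% confidence interval are computed from a 2x2 table. The cell counts are hypothetical placeholders, not the study's data, so the output does not reproduce the reported OR of 10.0.

```python
# Sketch: odds ratio with a 95% confidence interval from a 2x2 table.
# Cell counts are hypothetical, not the study's data.
import math

# Rows: exposed (good knowledge) / unexposed; columns: good practices yes / no.
a, b = 150, 30   # exposed: with / without good practices (hypothetical)
c, d = 70, 130   # unexposed: with / without good practices (hypothetical)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({lower:.2f} - {upper:.2f})")
```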

Relevance:

80.00%

Publisher:

Abstract:

Many different photovoltaic technologies are being developed for large-scale solar energy conversion, such as crystalline silicon solar cells and thin-film solar cells based on a-Si:H, CIGS and CdTe. As the demand for photovoltaics rapidly increases, there is a pressing need to identify new visible-light-absorbing materials for thin-film solar cells. A wide range of earth-abundant absorber materials is currently being studied by research groups around the world. The present thin-film photovoltaic market is dominated by technologies based on CdTe and CIGS; these solar cells have reached laboratory efficiencies of up to 19.6% and 20.8%, respectively. However, the scarcity and high cost of In, Ga and Te may limit the large-scale production of photovoltaic devices in the long term. On the other hand, quaternary CZTSSe, which contains abundant and inexpensive elements such as Cu, Zn, Sn, S and Se, has been a potential candidate for PV technology, with solar cell efficiencies of up to 12.6%; however, some challenges remain for this material. There is therefore an evident need to find alternative, inexpensive and earth-abundant materials for thin-film solar cells. One such alternative is copper antimony sulfide (CuSbS2), which contains abundant, non-toxic elements and has a direct optical band gap of 1.5 eV, the optimum value for an absorber material in solar cells, suggesting it as one of the new photovoltaic materials. This thesis work focuses on the preparation and characterization of In6Se7, CuSbS2 and CuSb(S1-xSex)2 thin films for application as absorber materials in photovoltaic structures, using a two-stage process that combines chemical bath deposition and thermal evaporation.

Relevance:

80.00%

Publisher:

Abstract:

Matching theory and matching markets are a core component of modern economic theory and market design. This dissertation presents three original contributions to this area. The first essay constructs a matching mechanism in an incomplete information matching market in which the positive assortative match is the unique efficient and unique stable match. The mechanism asks each agent in the matching market to reveal her privately known type. Through its novel payment rule, truthful revelation forms an ex post Nash equilibrium in this setting. This mechanism works in one-, two- and many-sided matching markets, thus offering the first mechanism to unify these matching markets under a single mechanism design framework. The second essay confronts a problem of matching in an environment in which no efficient and incentive compatible matching mechanism exists due to matching externalities. I develop a two-stage matching game in which a contracting stage facilitates a subsequent, conditionally efficient and incentive compatible Vickrey auction stage. Infinite repetition of this two-stage matching game enforces the contract in every period. This mechanism produces inequitably distributed social improvement: parties to the contract receive all of the gains and then some. The final essay demonstrates the existence of prices which stably and efficiently partition a single set of agents into firms and workers, and match those two sets to each other. This pricing system extends Kelso and Crawford's general equilibrium results in a labor market matching model and links one- and two-sided matching markets as well.
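
As a small illustration of the matching outcome discussed in the first essay (not of its mechanism or payment rule), the sketch below computes the positive assortative match in a two-sided market with a complementary (supermodular) match value such as x*y: sorting both sides by type and pairing rank by rank. The agent names and type values are hypothetical.

```python
# Sketch: positive assortative matching under a supermodular match value x*y.
# Illustrative only; types and agents are hypothetical.

def assortative_match(side_a, side_b):
    """Return rank-by-rank pairs of (agent, type) after sorting both sides."""
    a_sorted = sorted(side_a, key=lambda agent: agent[1])
    b_sorted = sorted(side_b, key=lambda agent: agent[1])
    return list(zip(a_sorted, b_sorted))

firms   = [("f1", 0.9), ("f2", 0.3), ("f3", 0.6)]
workers = [("w1", 0.5), ("w2", 0.8), ("w3", 0.2)]

pairs = assortative_match(firms, workers)
total_surplus = sum(fa[1] * wb[1] for fa, wb in pairs)
for (f, xf), (w, yw) in pairs:
    print(f"{f} (type {xf}) matched with {w} (type {yw})")
print(f"Total surplus under x*y: {total_surplus:.2f}")
```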

Relevance:

80.00%

Publisher:

Abstract:

The objective of this paper is to obtain empirical evidence about the existence of asymmetric effects of monetary policy on economic activity, based on interest rate behavior. Monetary policy shows an asymmetric effect when an interest rate above its fundamental level has an impact on economic activity that is significantly different from that of an interest rate below its fundamental level. Changes in the interest rate that reflect changes in policy are identified using two-stage least squares. In the first stage, the fundamental level of the interest rate is estimated with a modified Taylor rule and its residuals are used to identify the stance of policy. The second stage consists of a regression of real output on a constant and lagged values of the positive and negative residuals obtained in the first stage. Asymmetry is determined by the statistical significance of the individual coefficients on the positive and negative residuals and of the difference between them. The empirical evidence, for the 1994:01–2002:11 period, suggests the existence of a weak asymmetry of monetary policy: although increases and reductions in the interest rate affect the production level significantly, the difference between the impacts is not significant.
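
The two-stage procedure described above can be sketched on simulated data as follows: stage 1 fits a simplified Taylor-type rule and splits its residuals into positive and negative parts; stage 2 regresses output on a constant and the lagged positive/negative residuals. The simulated series, the simplified rule and the coefficient values are illustrative assumptions, not the paper's 1994:01–2002:11 data set or specification.

```python
# Sketch of the two-stage estimation on simulated data (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
T = 120
inflation = rng.normal(5.0, 1.0, T)
output_gap = rng.normal(0.0, 1.0, T)

# Simulated policy rate around a simple rule, plus policy "shocks".
rate = 2.0 + 1.5 * inflation + 0.5 * output_gap + rng.normal(0.0, 0.5, T)

# Stage 1: estimate the rule by OLS and take residuals as the policy stance.
X1 = np.column_stack([np.ones(T), inflation, output_gap])
beta1, *_ = np.linalg.lstsq(X1, rate, rcond=None)
resid = rate - X1 @ beta1
pos, neg = np.maximum(resid, 0.0), np.minimum(resid, 0.0)

# Simulated real output responding (possibly asymmetrically) to the lagged stance.
output = 100.0 - 0.8 * np.roll(pos, 1) - 0.3 * np.roll(neg, 1) + rng.normal(0, 0.3, T)

# Stage 2: regress output on a constant and lagged positive/negative residuals.
X2 = np.column_stack([np.ones(T - 1), pos[:-1], neg[:-1]])
beta2, *_ = np.linalg.lstsq(X2, output[1:], rcond=None)
print("Stage-2 coefficients [const, pos residual, neg residual]:", beta2.round(2))
```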

Relevance:

80.00%

Publisher:

Abstract:

Background and aims: A gluten-free diet is to date the only treatment available to celiac disease sufferers. However, systematic reviews indicate that, depending on the method of evaluation used, only 42% to 91% of patients adhere to the diet strictly. Transculturally adapted tools that evaluate adherence beyond simple self-reported questions or invasive analyses are, therefore, of importance. The aim is to obtain a Spanish transcultural adaptation and validation of Leffler's Celiac Dietary Adherence Test. Methods: A two-stage observational cross-sectional study: translation and back translation by four qualified translators, followed by a validation stage in which the questionnaire was administered to 306 celiac disease patients aged between 12 and 72 years and resident in Aragon. Factorial structure, criterion validity and internal consistency were evaluated. Results: The Spanish version maintained the 7 items in a 3-factor structure. Feasibility was very high, with all questions answered, and the floor and ceiling effects were very low (4.3% and 1%, respectively). The Spearman correlations with the self-efficacy and quality of life scales and with the self-reported question were statistically significant (p < 0.01). According to the questionnaire criteria, adherence was 72.3%. Conclusion: The Spanish version of the Celiac Dietary Adherence Test shows appropriate psychometric properties and is, therefore, suitable for studying adherence to a gluten-free diet in clinical and research environments.

Relevance:

80.00%

Publisher:

Abstract:

Master's dissertation in Marine and Coastal Studies, Unidade de Ciências e Tecnologias dos Recursos Aquáticos, Univ. do Algarve, 2000.

Relevance:

80.00%

Publisher:

Abstract:

The use of infrared burners in industrial applications offers many technical and operational advantages, for example uniformity in the heat supplied in the form of radiation and convection, with greater control of emissions due to the passage of the exhaust gases through a macro-porous ceramic bed. This paper presents a commercial infrared burner adapted with an experimental ejector capable of promoting a mixture of liquefied petroleum gas (LPG) and glycerin. By varying the proportion of the two fuels, the performance of the infrared burner was evaluated through an energy balance and measurements of atmospheric emissions. A two-stage (low heat / high heat) modulating temperature controller with a thermocouple was introduced, using solenoid valves for each fuel. The infrared burner was tested by varying the amount of glycerin fed through a gravity feed system. For the thermodynamic analysis, the load was estimated with an aluminum plate located at the exit of the combustion gases, and the temperature distribution was measured by a data acquisition system that recorded real-time readings from the attached thermocouples. The burner showed stable combustion at levels of 15, 20 and 25% of glycerin added, in mass ratio to the LPG, increasing the heat supplied to the plate. The data obtained showed an improvement in the first-law efficiency of the infrared burner with increasing glycerin addition. The emission levels of the pollutant gases produced by combustion (CO, NOx, SO2 and HC) met the environmental limits set by resolution No. 382/2006 of CONAMA.
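
The energy balance implied above can be sketched as a simple first-law efficiency estimate: useful heat taken as the energy absorbed by the aluminum plate, divided by the chemical energy of the LPG/glycerin input. All numerical values in the sketch are hypothetical placeholders (typical handbook-style figures), not the paper's experimental data.

```python
# Sketch of a first-law efficiency estimate for the dual-fuel burner test.
# All values are hypothetical placeholders.

CP_ALUMINUM = 0.897          # kJ/(kg*K)
LHV_LPG = 46_000.0           # kJ/kg (typical value)
LHV_GLYCERIN = 16_000.0      # kJ/kg (typical value for crude glycerin)

def first_law_efficiency(m_plate, dT_plate, m_lpg, m_glycerin):
    """Ratio of heat absorbed by the plate to fuel energy input (both in kJ)."""
    q_useful = m_plate * CP_ALUMINUM * dT_plate
    q_input = m_lpg * LHV_LPG + m_glycerin * LHV_GLYCERIN
    return q_useful / q_input

# Hypothetical test: 2 kg plate heated by 150 K, burning 20 g LPG + 5 g glycerin.
eta = first_law_efficiency(m_plate=2.0, dT_plate=150.0, m_lpg=0.020, m_glycerin=0.005)
print(f"First-law efficiency: {eta:.1%}")
```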

Relevance:

80.00%

Publisher:

Abstract:

Medication reconciliation is the appropriate combination of knowledge and scientific evidence on reactions, interactions and patients' needs, and is essential to the proper use of medications. General objective: To establish medication reconciliation and identify the types of discrepancies existing at admission, during hospitalization and at discharge among patients in the gynecology ward of the Hospital Vicente Corral Moscoso, Cuenca, during November–December 2015. Methodology: A descriptive study was designed with a population of 200 patients hospitalized in the gynecology ward of the Hospital Vicente Corral Moscoso over 2 months of 2015. Data were collected using a two-stage reconciliation form, based on the prescriptions in the clinical record and interviews with the patients, and were entered into SPSS 15.0 for tabulation, analysis and presentation in tables. Results: 161 reconciliation errors and 42 justified discrepancies were found, an average of 1.87 unjustified discrepancies per patient. The most frequent reconciliation error at admission involved a different dose, route or frequency of administration (84.6%); during hospitalization and at discharge, it was incomplete prescriptions (40% and 60.3%, respectively). Conclusions: The frequency with which medication reconciliation is performed at the Hospital Vicente Corral Moscoso was 15%. 52% of patients are exposed to risk due to discrepancies in prescriptions; of these, 43% are reconciliation errors and 9% are justified discrepancies.

Relevance:

80.00%

Publisher:

Abstract:

Hardboard processing wastewater was evaluated as a feedstock for the production of fuel-grade ethanol in a biorefinery co-located with the hardboard facility. A thorough characterization of the wastewater was conducted, and its composition changes during the biorefinery process were tracked. It was determined that the wastewater had a low solids content (1.4%), and hemicellulose was the main component of the solids, accounting for up to 70%. Acid pretreatment alone can hydrolyze the majority of the hemicellulose as well as the oligomers, and over 50% of the monomer sugars generated were xylose. The percentage of lignin remaining in the liquid increased after acid pretreatment. The characterization results showed that hardboard processing wastewater is a feasible feedstock for the production of ethanol. The optimum conditions for hydrolyzing hemicellulose into fermentable sugars were evaluated with a two-stage experiment comprising acid pretreatment and enzymatic hydrolysis. The experimental data were fitted to second-order regression models and Response Surface Methodology (RSM) was employed. The results showed that, for this type of feedstock, enzymatic hydrolysis is not strictly necessary. To reach a comparatively high total sugar concentration (over 45 g/l) and a low furfural concentration (less than 0.5 g/l), the optimum conditions were an acid concentration between 1.41 and 1.81% and a reaction time of 48 to 76 minutes. The two products of the biorefinery were compared with traditional products, petroleum gasoline and traditional potassium acetate, from a sustainability perspective, with greenhouse gas (GHG) emissions as the indicator. Three allocation methods were employed in this assessment: system expansion, mass allocation and market value allocation. The life cycle GHG emissions of ethanol were determined to be -27.1, 20.8 and 16 g CO2 eq/MJ, respectively, under the three allocation methods, whereas that of petroleum gasoline is 90 g CO2 eq/MJ. The life cycle GHG emissions of potassium acetate under the mass allocation and market value allocation methods were 555.7 and 716.0 g CO2 eq/kg, whereas that of traditional potassium acetate is 1020 g CO2 eq/kg.
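
To illustrate how mass and market-value allocation divide the burden between the two co-products, the sketch below allocates a batch total of GHG emissions to ethanol and potassium acetate in proportion to mass or to revenue. The masses, prices and the total emission figure are hypothetical placeholders, not the study's life cycle inventory.

```python
# Sketch of mass and market-value allocation of biorefinery GHG emissions
# between the two co-products. All values are hypothetical placeholders.

total_emissions = 1000.0          # kg CO2 eq for one batch (hypothetical)
products = {
    # name: (mass in kg, market price in $/kg) -- hypothetical values
    "ethanol":           (800.0, 0.60),
    "potassium_acetate": (200.0, 1.50),
}

def allocate(products, total, by_value=False):
    """Allocate total emissions in proportion to mass or to revenue."""
    weights = {
        name: (mass * price if by_value else mass)
        for name, (mass, price) in products.items()
    }
    total_weight = sum(weights.values())
    return {name: total * w / total_weight for name, w in weights.items()}

print("Mass allocation (kg CO2 eq):        ", allocate(products, total_emissions))
print("Market-value allocation (kg CO2 eq):", allocate(products, total_emissions, by_value=True))
```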