887 results for Two-stage PLS


Relevance:

80.00%

Publisher:

Abstract:

The use of infrared burners in industrial applications has several technical and operational advantages, such as uniform heat supply in the form of radiation and convection, and better control of emissions due to the passage of the exhaust gases through a macro-porous ceramic bed. This work presents a commercial infrared burner adapted with an experimental ejector capable of promoting a mixture of liquefied petroleum gas (LPG) and glycerin. By varying the dual-fuel percentage, the performance of the infrared burner was evaluated through an energy balance and measurements of atmospheric emissions. A two-stage (low heat / high heat) modulating temperature controller with a thermocouple was introduced, using solenoid valves for each fuel. The burner was tested while varying the amount of glycerin fed by a gravity system. For the thermodynamic analysis, the thermal load was estimated using an aluminum plate placed at the exit of the combustion gases, and the temperature distribution was measured by a data acquisition system that recorded the attached thermocouples in real time. The burner showed stable combustion at glycerin additions of 15, 20 and 25% by mass relative to the LPG, increasing the heat supplied to the plate. The data showed an improvement in the first-law efficiency of the infrared burner as the glycerin addition increased. The emission levels of the combustion pollutants (CO, NOx, SO2 and HC) met the environmental limits set by CONAMA Resolution No. 382/2006.
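
As a rough illustration of the first-law (energy) efficiency used to evaluate the burner, the sketch below computes the ratio of useful heat absorbed by the plate to the fuel energy input of an LPG/glycerin mixture; the heating values and flow rates are placeholder assumptions, not measurements from the study.

```python
# Illustrative first-law efficiency of a dual-fuel burner: useful heat divided
# by the chemical energy supplied with the two fuels. All numbers are assumed.
LHV_LPG = 46.1e6        # J/kg, typical lower heating value of LPG (assumption)
LHV_GLYCERIN = 16.0e6   # J/kg, typical lower heating value of crude glycerin (assumption)

def first_law_efficiency(q_useful_w, m_lpg_kg_s, m_gly_kg_s):
    """Ratio of useful heat delivered to the plate to the total fuel energy input."""
    fuel_power = m_lpg_kg_s * LHV_LPG + m_gly_kg_s * LHV_GLYCERIN
    return q_useful_w / fuel_power

# Example: 0.5 g/s of LPG, 20% glycerin by mass, 12 kW absorbed by the plate
m_lpg = 0.5e-3
m_gly = 0.20 * m_lpg
print(f"eta_I = {first_law_efficiency(12e3, m_lpg, m_gly):.1%}")
```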

Relevance:

80.00%

Publisher:

Abstract:

Medication reconciliation is the proper combination of knowledge and scientific evidence on drug reactions, interactions and patient needs, and is essential to the good use of medicines. General objective: to establish medication reconciliation and identify the types of discrepancies present at admission, during hospitalization and at discharge among patients of the gynecology ward of the Hospital Vicente Corral Moscoso, Cuenca, during November-December 2015. Methodology: a descriptive study was designed with a population of 200 patients hospitalized in the gynecology ward of the Hospital Vicente Corral Moscoso over two months of 2015. Data were collected with a two-stage reconciliation form, based on the prescriptions in the clinical record and interviews with the patients, and were entered into SPSS 15.0 for tabulation, analysis and presentation in tables. Results: 161 reconciliation errors and 42 justified discrepancies were found, an average of 1.87 unjustified discrepancies per patient. The most frequent reconciliation error at admission was a different dose, route or frequency of administration (84.6%); during hospitalization and at discharge, it was incomplete prescriptions (40% and 60.3%, respectively). Conclusions: Medication reconciliation was performed in 15% of cases at the Hospital Vicente Corral Moscoso. 52% of patients were exposed to risk from discrepancies in their prescriptions; of these, 43% were reconciliation errors and 9% were justified discrepancies.

Relevance:

80.00%

Publisher:

Abstract:

Hardboard processing wastewater was evaluated as a feedstock for fuel-grade ethanol production in a biorefinery co-located with the hardboard facility. A thorough characterization of the wastewater was conducted, and its composition changes through the biorefinery process were tracked. The wastewater had a low solids content (1.4%), and hemicellulose was the main component of the solids, accounting for up to 70%. Acid pretreatment alone hydrolyzed the majority of the hemicellulose and its oligomers, and over 50% of the monomeric sugars generated were xylose. The percentage of lignin remaining in the liquid increased after acid pretreatment. The characterization results showed that hardboard processing wastewater is a feasible feedstock for ethanol production. The optimum conditions for hydrolyzing hemicellulose into fermentable sugars were evaluated with a two-stage experiment comprising acid pretreatment and enzymatic hydrolysis. The experimental data were fitted to second-order regression models and Response Surface Methodology (RSM) was employed. The results showed that, for this type of feedstock, enzymatic hydrolysis is not strictly necessary. To reach a comparatively high total sugar concentration (over 45 g/L) and a low furfural concentration (less than 0.5 g/L), the optimum conditions were an acid concentration between 1.41 and 1.81% and a reaction time of 48 to 76 minutes. The two products of the biorefinery were compared with the traditional products, petroleum gasoline and conventionally produced potassium acetate, from a sustainability perspective, using greenhouse gas (GHG) emissions as the indicator. Three allocation methods were employed in this assessment: system expansion, mass allocation and market value allocation. The life cycle GHG emissions of ethanol were -27.1, 20.8 and 16 g CO2 eq/MJ under the three allocation methods, respectively, whereas that of petroleum gasoline is 90 g CO2 eq/MJ. The life cycle GHG emissions of potassium acetate under the mass allocation and market value allocation methods were 555.7 and 716.0 g CO2 eq/kg, whereas that of traditional potassium acetate is 1020 g CO2 eq/kg.
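
To illustrate the response-surface step, the sketch below fits a second-order model in acid concentration and reaction time by ordinary least squares; the data points are placeholders, not the experimental values from the study.

```python
# Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2, where x1 is
# acid concentration (%) and x2 is reaction time (min). All data are assumed.
import numpy as np

x1 = np.repeat([1.0, 1.5, 2.0], 3)               # acid concentration, %
x2 = np.tile([40.0, 60.0, 80.0], 3)              # reaction time, min
y = np.array([35., 40., 42., 44., 47., 46., 42., 43., 39.])  # total sugars, g/L (assumed)

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients b0, b1, b2, b11, b22, b12:", np.round(coeffs, 3))
```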

Relevance:

80.00%

Publisher:

Abstract:

A novel approach to determining steroid entrapment in the bilayers of aerosolised liposomes has been introduced using high-sensitivity differential scanning calorimetry (DSC). Proliposomes were dispersed in water within an air-jet nebuliser, and the energy produced during atomisation was used to hydrate the proliposomes and generate liposome aerosols. Proliposomes that included the steroid beclometasone dipropionate (BDP) produced lower aerosol and lipid outputs than steroid-free proliposomes. Size analysis and transmission electron microscopy showed evidence of liposome formation within the nebuliser, followed by deaggregation and size reduction of the multilamellar liposomes upon nebulisation into a two-stage impinger. For each formulation, no difference in thermal transitions was observed between the delivered liposomes and those remaining in the nebuliser. However, the steroid (5 mol%) lowered the onset temperature and the enthalpy of the pretransition, and produced a similar onset temperature and a larger enthalpy of the main transition, broadening both the pretransition and the main transition. This indicates that BDP was entrapped and interacted with the liposome phospholipid membranes. Since the pretransition was depressed but not completely removed and no phase separation occurred, it is suggested that the bilayers of the multilamellar liposomes can entrap more than 5 mol% BDP. Overall, liposomes were generated from proliposomes, and the DSC investigations indicated that the steroid was entrapped in the bilayers of the aerosolised multilamellar vesicles.

Relevance:

80.00%

Publisher:

Abstract:

We report a two-stage diode-pumped Er-doped fiber amplifier operating at a wavelength of 1550 nm and repetition rates of 10-100 kHz, with an average output power of up to 10 W. The first stage, comprising Er-doped fiber, was core-pumped at 1480 nm, whereas the second stage, comprising double-clad Er/Yb-doped fiber, was clad-pumped at 975 nm. The estimated peak power for the 0.4-nm full-width-at-half-maximum laser emission at 1550 nm exceeded the 4-kW level. The initial 100-ns seed diode laser pulse was compressed to 3.5 ns as a result of the 34-dB total amplification. The observed, efficient 30-fold pulse compression reveals a promising new nonlinear optical technique for the generation of high-power short pulses for applications in eye-safe ranging and micromachining.
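
For orientation, the sketch below relates average power, repetition rate and compressed pulse width; it assumes the entire pulse energy is carried by the compressed pulse, so it is a naive upper-bound estimate rather than the in-band (0.4-nm FWHM) figure quoted above.

```python
# Back-of-the-envelope peak power from average power, repetition rate and
# pulse width, assuming a flat-top pulse carrying all of the pulse energy.
def peak_power(avg_power_w, rep_rate_hz, pulse_width_s):
    pulse_energy = avg_power_w / rep_rate_hz   # J per pulse
    return pulse_energy / pulse_width_s        # W

print(peak_power(10.0, 100e3, 3.5e-9))  # ~2.9e4 W at 10 W average and 100 kHz
```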

Relevance:

80.00%

Publisher:

Abstract:

Thesis (Master, Chemical Engineering) -- Queen's University, 2016-08-16

Relevance:

80.00%

Publisher:

Abstract:

Process systems design, operation and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and nonlinear (nonconvex) programming (MILP and MINLP) problems. With a scenario-based formulation, these lead to large-scale MILPs/MINLPs that are well structured. The first part of the thesis proposes a new finitely convergent cross decomposition method (CD), in which Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. The method alternates between DWD iterations and BD iterations, where the DWD restricted master problems and BD primal problems yield a sequence of upper bounds, and the BD relaxed master problems yield a sequence of lower bounds. A variant of CD that adds multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, called multicolumn-multicut CD, is then developed to improve solution time. Finally, an extended cross decomposition method (ECD) for solving two-stage stochastic programs with risk constraints is proposed. In this approach, CD is used at the first level and DWD at the second level to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy or solving the monolithic problem with an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonlinear nonconvex programming problems to global optimality, without the need for explicit branch-and-bound search. In this approach, LD subproblems and GBD subproblems are solved systematically in a single framework. The relaxed master problem, obtained from a reformulation of the original problem, is solved only when necessary. A convexification of the relaxed master problem and a domain reduction procedure are integrated into the decomposition framework to improve solution efficiency. Case studies taken from renewable-resource and fossil-fuel based applications in process systems engineering show that these novel decomposition approaches offer significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
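
To make the bounding scheme concrete, the following is a minimal single-cut L-shaped (Benders-type) loop for a toy two-stage stochastic LP with one first-stage variable; it illustrates the alternation of upper bounds (from feasible first-stage decisions) and lower bounds (from the relaxed master), not the cross decomposition algorithm of the thesis, and all numbers are assumptions.

```python
# Toy problem:  min  c*x + E_s[ q * max(d_s - x, 0) ],  0 <= x <= x_max.
# The recourse problem has a closed form here, so its value and subgradient
# are written directly; the master LP is solved with scipy's linprog.
import numpy as np
from scipy.optimize import linprog

c, q, x_max = 1.0, 1.5, 10.0
d = np.array([2.0, 4.0, 6.0])        # scenario demands (assumed)
p = np.array([0.3, 0.4, 0.3])        # scenario probabilities (assumed)
cuts = []                            # optimality cuts: theta >= slope*x + intercept

x_hat, lower, upper = 0.0, -np.inf, np.inf
for it in range(20):
    Q = q * np.maximum(d - x_hat, 0.0)          # recourse cost per scenario
    g = np.where(d > x_hat, -q, 0.0)            # subgradient of Q_s at x_hat
    upper = min(upper, c * x_hat + p @ Q)       # feasible solution -> upper bound
    cuts.append((p @ g, p @ Q - (p @ g) * x_hat))

    # Master: min c*x + theta  s.t.  theta >= slope*x + intercept  for all cuts
    A_ub = np.array([[slope, -1.0] for slope, _ in cuts])
    b_ub = np.array([-intercept for _, intercept in cuts])
    res = linprog(c=[c, 1.0], A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, x_max), (None, None)])
    x_hat, lower = res.x[0], res.fun            # relaxation -> lower bound
    if upper - lower < 1e-6:
        break

print(f"x* = {x_hat:.3f}, optimal cost = {upper:.3f}")   # expect x* = 4, cost = 4.9
```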

Relevance:

80.00%

Publisher:

Abstract:

This paper investigates how textbook design may influence students’ visual attention to graphics, photos and text in current geography textbooks. Eye tracking, a visual method of data collection and analysis, was utilised to precisely monitor students’ eye movements as they observed geography textbook spreads. In an exploratory study using random sampling, the eye movements of 20 students (secondary school students 15–17 years of age and university students 20–24 years of age) were recorded. The research materials were double-page spreads of current German geography textbooks covering an identical topic, taken from five separate textbooks. A two-stage test was developed. Each participant was first given the task of looking at the entire textbook spread to determine what was being explained on the pages. In the second stage, participants solved one of the tasks from the exercise section. Overall, each participant studied five different textbook spreads and completed five set tasks. After the eye tracking study, each participant completed a questionnaire. The results support textbook design as one crucial factor for successful knowledge acquisition from textbooks. Based on the eye tracking documentation, learning-related challenges posed by images and complex image-text structures in textbooks are elucidated and related to insights from educational psychology and findings from visual communication and textbook analysis.

Relevance:

80.00%

Publisher:

Abstract:

Single-stage and two-stage sodium sulfite cooking were carried out on spruce, pine or pure pine heartwood chips to investigate the influence of several process parameters on the initial phase of such a cook, down to about 60% pulp yield. The cooking experiments were carried out in the laboratory with either a lab-prepared or a mill-prepared cooking acid, and the temperature and time were varied. The influences of dissolved organic and inorganic components in the cooking liquor on the final pulp composition and on the extent of side reactions were investigated. Kinetic equations were developed, and the activation energies for delignification and carbohydrate dissolution were calculated using the Arrhenius equation. A better understanding of the delignification mechanisms during bisulfite and acid sulfite cooking was obtained by analyzing the lignin-carbohydrate complexes (LCC) present in the pulp under different cooking conditions. Compared with a lab-prepared cooking acid, the mill-prepared cooking acid had beneficial effects with respect to side reactions, extractives removal and pH stability during the cook; however, no significant difference in the degree of delignification or carbohydrate degradation was seen. The cellulose yield was not affected in the initial phase of the cook; temperature, however, influenced the rates of both delignification and hemicellulose removal. It was also found that the corresponding activation energies increased in the order xylan, glucomannan, lignin and cellulose. The cooking temperature could thus be used to steer the cook towards a given carbohydrate composition in the final pulp. Lignin condensation reactions were observed during acid sulfite cooking, especially at higher temperatures. The LCC studies indicated the existence of covalent bonds between lignin and the hemicellulose components xylan and glucomannan. LCC in native wood showed the presence of phenyl glycosides, γ-esters and α-ethers, of which the α-ethers were affected during sulfite pulping. The existence of covalent bonds between lignin and wood polysaccharides might be the rate-limiting factor in sulfite pulping.
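
Since the activation energies were obtained via the Arrhenius equation, k = A * exp(-Ea / (R*T)), the sketch below shows the standard two-temperature estimate; the rate constants are placeholder values, not results from the study.

```python
# Estimate an activation energy from rate constants at two cooking temperatures
# using ln(k2/k1) = -(Ea/R) * (1/T2 - 1/T1). Rate constants are assumed.
import math

R = 8.314                  # J/(mol K)
T1, k1 = 413.15, 0.010     # 140 C, assumed delignification rate constant (1/min)
T2, k2 = 433.15, 0.032     # 160 C, assumed delignification rate constant (1/min)

Ea = -R * math.log(k2 / k1) / (1.0 / T2 - 1.0 / T1)
print(f"Ea = {Ea / 1000:.0f} kJ/mol")   # ~87 kJ/mol for these assumed values
```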

Relevance:

80.00%

Publisher:

Abstract:

The Three-Dimensional Single-Bin-Size Bin Packing Problem is one of the most studied problems in the Cutting & Packing category. From a strictly mathematical point of view, it consists of packing a finite set of strongly heterogeneous “small” boxes, called items, into a finite set of identical “large” rectangular boxes, called bins, minimizing the unused volume and requiring that the items are packed without overlapping. The great interest is mainly due to the number of real-world applications in which it arises, such as pallet and container loading, cutting objects out of a piece of material, and packaging design. Depending on the real-world application, additional objective functions and practical constraints may be needed. After a brief discussion of the real-world applications of the problem and an exhaustive literature review, the design of a two-stage algorithm to solve the problem is presented. The algorithm must provide the spatial coordinates of the placed boxes’ vertices and the optimal box input sequence, while guaranteeing geometric, stability and fragility constraints and a reduced computational time. Due to the NP-hard complexity of this type of combinatorial problem, a fusion of metaheuristic and machine learning techniques is adopted; in particular, a hybrid genetic algorithm coupled with a feedforward neural network is used. In the first stage, a rich dataset is created starting from a set of real input instances provided by an industrial company, and the feedforward neural network is trained on it. Once trained, given a new input instance, the hybrid genetic algorithm runs using the neural network output as its input parameter vector and provides the optimal solution as output. The effectiveness of the proposed approach is confirmed by several experimental tests.
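
A minimal sketch of the two-stage coupling described above: a feedforward network learns to map instance features to GA parameters, and the genetic algorithm is then run with the predicted parameter vector. The training arrays, the choice of features and parameters, and the run_ga stub are placeholder assumptions, not the implementation from the thesis.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Stage 1: train on historical instances (instance features -> good GA parameters)
rng = np.random.default_rng(0)
X_train = rng.random((200, 4))   # e.g. item count, mean item volume, ... (assumed)
y_train = rng.random((200, 3))   # e.g. population size, mutation rate, crossover rate (assumed)
net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000).fit(X_train, y_train)

def run_ga(instance, params):
    """Hypothetical GA driver: evolves box input sequences with the given parameters."""
    population_size, mutation_rate, crossover_rate = params
    # ... evolve sequences, decode them into placements, score volume utilisation ...
    return {"sequence": [], "coordinates": []}

# Stage 2: for a new instance, predict GA parameters and hand them to the GA
new_features = rng.random((1, 4))
params = net.predict(new_features)[0]
solution = run_ga({"boxes": []}, params)
```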

Relevance:

80.00%

Publisher:

Abstract:

Over the last century, mathematical optimization has become a prominent tool for decision making. Its systematic application in practical fields such as economics, logistics or defense has led to the development of algorithmic methods with ever-increasing efficiency. Indeed, for a variety of real-world problems, finding an optimal decision among a set of (implicitly or explicitly) predefined alternatives has become conceivable in reasonable time. In recent decades, however, the research community has paid more and more attention to the role of uncertainty in the optimization process. In particular, one may question the notion of optimality, and even feasibility, when studying decision problems with unknown or imprecise input parameters. This concern is even more critical in a world that is becoming more and more complex, by which we mean interconnected, where each variation inside a system inevitably causes other variations in the system itself. In this dissertation, we study a class of optimization problems which suffer from imprecise input data and feature a two-stage decision process, i.e., where decisions are made in a sequential order, called stages, and where unknown parameters are revealed throughout the stages. Applications of such problems abound in practical fields, e.g., facility location with uncertain demands, transportation with uncertain costs, or scheduling under uncertain processing times. The uncertainty is handled from a robust optimization (RO) viewpoint (also known as the "worst-case perspective"), and we present original contributions to the RO literature on both the theoretical and the practical side.
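
Written generically, such a two-stage robust problem takes the standard min-max-min form shown below; this is the textbook formulation, given for orientation, and not necessarily the exact model studied in the dissertation:

$$
\min_{x \in X} \; \max_{\xi \in \Xi} \; \min_{y \in Y(x,\xi)} \; c^{\top} x + q(\xi)^{\top} y
$$

Here $x$ collects the here-and-now (first-stage) decisions, $\xi$ the uncertain parameters ranging over the uncertainty set $\Xi$, and $y$ the wait-and-see (second-stage) recourse decisions taken after $\xi$ is revealed.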

Relevance:

80.00%

Publisher:

Abstract:

In this PhD thesis a new firm-level conditional risk measure is developed. It is named Joint Value at Risk (JVaR) and is defined as a quantile of a conditional distribution of interest, where the conditioning event is a latent upper tail event. It addresses the problem of how risk changes under extreme volatility scenarios. The properties of JVaR are studied based on a stochastic volatility representation of the underlying process. We prove that JVaR is leverage consistent, i.e. it is an increasing function of the dependence parameter in the stochastic representation. A feasible class of nonparametric M-estimators is introduced by exploiting the elicitability of quantiles and stochastic ordering theory. Consistency and asymptotic normality of the two-stage M-estimator are derived, and a simulation study is reported to illustrate its finite-sample properties. Parametric estimation methods are also discussed. The relation with the VaR is exploited to introduce a volatility contribution measure, and a tail risk measure is also proposed. The analysis of the dynamic JVaR is presented based on asymmetric stochastic volatility models. Empirical results with S&P500 data show that accounting for extreme volatility levels is relevant for better characterizing the evolution of risk. The work is complemented by a review of the literature, where we provide an overview of quantile risk measures, elicitable functionals and several stochastic orderings.
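
For orientation, one way to formalize a quantile conditioned on a latent upper tail event is sketched below; this is an illustrative formalization consistent with the description above, not necessarily the exact definition used in the thesis:

$$
\operatorname{VaR}_{\alpha}(Y) = \inf\{ y : \Pr(Y \le y) \ge \alpha \}, \qquad
\operatorname{JVaR}_{\alpha,\beta}(Y) = \inf\bigl\{ y : \Pr\bigl(Y \le y \mid V > \operatorname{VaR}_{\beta}(V)\bigr) \ge \alpha \bigr\},
$$

where $Y$ is the loss (or return) of interest and $V$ is the latent volatility process whose upper $\beta$-tail defines the conditioning event.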

Relevance:

80.00%

Publisher:

Abstract:

Latency can be defined as the sum of the arrival times at the customers. Minimum latency problems are especially relevant in applications related to humanitarian logistics. This thesis presents algorithms for solving a family of vehicle routing problems with minimum latency. First, the latency location routing problem (LLRP) is considered. It consists of determining the subset of depots to be opened and the routes that a set of homogeneous capacitated vehicles must perform in order to visit a set of customers, such that the sum of the demands of the customers assigned to each vehicle does not exceed the capacity of the vehicle. For solving this problem, three metaheuristic algorithms combining simulated annealing and variable neighborhood descent, and an iterated local search (ILS) algorithm, are proposed. Furthermore, the multi-depot cumulative capacitated vehicle routing problem (MDCCVRP) and the multi-depot k-traveling repairman problem (MDk-TRP) are solved with the proposed ILS algorithm. The MDCCVRP is a special case of the LLRP in which all the depots can be opened, and the MDk-TRP is a special case of the MDCCVRP in which the capacity constraints are relaxed. Finally, an LLRP with stochastic travel times is studied. A two-stage stochastic programming model and a variable neighborhood search algorithm are proposed for solving the problem, and a sampling method is developed for tackling instances with an infinite number of scenarios. Extensive computational experiments show that the proposed methods are effective for solving the problems under study.
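
As a concrete reading of the latency objective defined above, the small helper below accumulates arrival times along a single route starting from a depot; the travel-time matrix is an assumed toy example.

```python
# Latency of one route = sum of the arrival times at the visited customers.
# Location 0 is the depot; travel[i][j] is the travel time between i and j.
def route_latency(route, travel):
    """route: customer indices in visiting order, e.g. [3, 1, 2]."""
    latency, clock, prev = 0.0, 0.0, 0
    for customer in route:
        clock += travel[prev][customer]   # arrival time at this customer
        latency += clock
        prev = customer
    return latency

travel = [[0, 4, 7, 2],
          [4, 0, 3, 5],
          [7, 3, 0, 6],
          [2, 5, 6, 0]]
print(route_latency([3, 1, 2], travel))   # 2 + (2+5) + (2+5+3) = 19
```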

Relevance:

40.00%

Publisher:

Abstract:

The pivotal role of spleen CD4(+) T cells in the development of both malaria pathogenesis and protective immunity calls for a thorough understanding of the mechanisms involved in their activation and regulation during Plasmodium infection. Herein, we examined in detail the behaviour of non-conventional and conventional splenic CD4(+) T cells during P. chabaudi malaria. We took advantage of the fact that a great proportion of the CD4(+) T cells generated in CD1d(-/-) mice are I-A(b)-restricted (conventional cells), while their counterparts in I-A(b)(-/-) mice are restricted by CD1d and other class IB major histocompatibility complex (MHC) molecules (non-conventional cells). We found that conventional CD4(+) T cells are the main protagonists of the immune response to infection, which develops in two consecutive phases concomitant with the acute and chronic parasitaemias. The early phase of the conventional CD4(+) T cell response is intense and short lasting, rapidly providing large amounts of proinflammatory cytokines and helping follicular and marginal zone B cells to secrete polyclonal immunoglobulin. Both TNF-alpha and IFN-gamma production depend mostly on conventional CD4(+) T cells. IFN-gamma is produced simultaneously by non-conventional and conventional CD4(+) T cells. The early phase of the response finishes after a week of infection, with the elimination of a large proportion of CD4(+) T cells, which then opens the way for the development of acquired immunity. Unexpectedly, the major contribution of CD1d-restricted CD4(+) T cells occurs at the beginning of the second phase of the response, but not earlier, helping both IFN-gamma and parasite-specific antibody production. We concluded that conventional CD4(+) T cells have a central role from the onset of P. chabaudi malaria, acting in parallel with non-conventional CD4(+) T cells as a link between innate and acquired immunity. This study contributes to the understanding of malaria immunology and opens a perspective for future studies designed to decipher the molecular mechanisms behind immune responses to Plasmodium infection.

Relevance:

40.00%

Publisher:

Abstract:

PURPOSE: To retrospectively assess the influence of prophylactic cranial irradiation (PCI) timing on brain relapse rates in patients treated with two different chemoradiotherapy (CRT) regimens for Stage IIIB non-small-cell lung cancer (NSCLC). METHODS AND MATERIALS: A cohort of 134 patients with Stage IIIB NSCLC in recursive partitioning analysis Group 1 was treated with PCI (30 Gy at 2 Gy/fraction) following one of two CRT regimens. Regimen 1 (n = 58) consisted of three cycles of induction chemotherapy (ICT) followed by concurrent CRT (C-CRT). Regimen 2 (n = 76) consisted of immediate C-CRT during thoracic radiotherapy. RESULTS: At a median follow-up of 27.6 months (range, 7.2-40.4), 65 patients were alive. Median overall, progression-free, and brain metastasis-free survival (BMFS) times for the whole study cohort were 23.4, 15.4, and 23.0 months, respectively. Median survival time and the 3-year survival rate for regimens 1 and 2 were 19.3 vs. 26.1 months (p = 0.001) and 14.4% vs. 34.4% (p < 0.001), respectively. Median time from the initiation of primary treatment to PCI was 123.2 days (range, 97-161) for regimen 1 and 63.4 days (range, 55-74) for regimen 2 (p < 0.001). Overall, 11 (8.2%) patients developed brain metastasis (BM) during the follow-up period: 8 (13.8%) in regimen 1 and 3 (3.9%) in regimen 2 (p = 0.03). Only 3 (2.2%) patients developed BM at the site of first failure, and for 2 of them it was also the sole site of recurrence. Median BMFS for regimens 1 and 2 was 17.4 months (13.5-21.3) vs. 26.0 months (22.9-29.1), respectively (p < 0.001). CONCLUSION: These results suggest that, in Stage IIIB NSCLC patients treated with PCI, lower BM incidence and longer survival result from immediate C-CRT rather than ICT-first regimens, indicating the benefit of earlier PCI without the delay imposed by induction protocols.