901 results for two-stage sequential procedure


Relevance:

100.00%

Abstract:

Hardboard processing wastewater was evaluated as a feedstock for fuel-grade ethanol production in a biorefinery co-located with the hardboard facility. A thorough characterization of the wastewater was conducted, and its composition changes during processing in the biorefinery were tracked. The wastewater had a low solids content (1.4%), with hemicellulose the main component of the solids, accounting for up to 70%. Acid pretreatment alone hydrolyzed the majority of the hemicellulose, including oligomers, and over 50% of the monomeric sugars generated were xylose. The percentage of lignin remaining in the liquid increased after acid pretreatment. The characterization results showed that hardboard processing wastewater is a feasible feedstock for ethanol production. The optimum conditions for hydrolyzing hemicellulose into fermentable sugars were evaluated in a two-stage experiment comprising acid pretreatment and enzymatic hydrolysis. The experimental data were fitted to second-order regression models using Response Surface Methodology (RSM). The results showed that enzymatic hydrolysis is largely unnecessary for this type of feedstock. To reach a comparatively high total sugar concentration (over 45 g/L) and a low furfural concentration (less than 0.5 g/L), the optimum conditions were an acid concentration of 1.41-1.81% and a reaction time of 48-76 minutes. The two products of the biorefinery were compared with their traditional counterparts, petroleum gasoline and conventionally produced potassium acetate, from a sustainability perspective, using greenhouse gas (GHG) emissions as the indicator. Three allocation methods (system expansion, mass allocation, and market value allocation) were employed in this assessment. The life cycle GHG emissions of ethanol were -27.1, 20.8, and 16 g CO2 eq/MJ under the three allocation methods, respectively, whereas that of petroleum gasoline is 90 g CO2 eq/MJ. The life cycle GHG emissions of potassium acetate under the mass and market value allocation methods were 555.7 and 716.0 g CO2 eq/kg, respectively, whereas that of traditional potassium acetate is 1020 g CO2 eq/kg.
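To illustrate the kind of second-order response surface fit described above, here is a minimal sketch with made-up design points and responses; the factor values and the resulting optimum are illustrative assumptions, not the study's data or its reported optimum.

```python
import numpy as np

# Hypothetical design points: acid concentration (%) and reaction time (min),
# with illustrative total-sugar responses (g/L); NOT the study's data.
acid = np.array([1.0, 1.0, 1.5, 1.5, 2.0, 2.0, 1.5, 1.0, 2.0])
time = np.array([40., 80., 40., 80., 40., 80., 60., 60., 60.])
sugar = np.array([38., 41., 44., 46., 42., 43., 47., 40., 44.])

# Second-order (quadratic) response surface:
# y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
X = np.column_stack([np.ones_like(acid), acid, time,
                     acid**2, time**2, acid * time])
coef, *_ = np.linalg.lstsq(X, sugar, rcond=None)

def predict(x1, x2):
    return coef @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])

# Evaluate the fitted surface on a grid to locate the region of maximum sugar.
grid_acid = np.linspace(1.0, 2.0, 101)
grid_time = np.linspace(40.0, 80.0, 101)
A, T = np.meshgrid(grid_acid, grid_time)
Y = np.vectorize(predict)(A, T)
i, j = np.unravel_index(np.argmax(Y), Y.shape)
print(f"Predicted optimum near acid = {A[i, j]:.2f}%, time = {T[i, j]:.0f} min")
```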

Relevance:

100.00%

Abstract:

Research has found that children with autism spectrum disorders (ASD) show significant deficits in receptive language skills (Wiesmer, Lord, & Esler, 2010). One of the primary goals of applied behavior analytic intervention is to improve the communication skills of children with autism by teaching receptive discriminations. Both receptive discriminations and receptive language entail matching spoken words with corresponding objects, symbols (e.g., pictures or words), actions, people, and so on (Green, 2001). To develop receptive language skills, children with autism often undergo discrimination training within the context of discrete trial training. This training teaches the learner to respond differentially to different stimuli (Green, 2001). It is through discrimination training that individuals with autism learn and develop language (Lovaas, 2003). The present study compares three procedures for teaching receptive discriminations: (1) simple/conditional (Procedure A), (2) conditional only (Procedure B), and (3) conditional discrimination of two target cards (Procedure C). Six children with an autism diagnosis, ranging in age from 2 to 5 years, were taught to receptively discriminate nine sets of stimuli. Results suggest that the extra training steps included in the simple/conditional and conditional-only procedures may not be necessary to teach children with autism to receptively discriminate. For all participants, Procedure C appeared to be the most efficient and effective procedure for teaching young children with autism receptive discriminations. Response maintenance and generalization probes conducted one month after the end of training indicate that even though Procedure C required fewer training sessions overall, no one procedure resulted in better maintenance and generalization than the others. In other words, the additional training sessions in the simple/conditional and conditional-only procedures did not improve participants' ability to respond accurately or to generalize one month after training. The present study contributes to the literature on the most efficient and effective way to teach receptive discriminations during discrete trial training to children with ASD. These findings are important because research shows that receptive language skills are predictive of better outcomes and adaptive behaviors in the future.

Relevance:

100.00%

Abstract:

A novel approach to determining steroid entrapment in the bilayers of aerosolised liposomes is introduced using high-sensitivity differential scanning calorimetry (DSC). Proliposomes were dispersed in water within an air-jet nebuliser, and the energy produced during atomisation was used to hydrate the proliposomes and generate liposome aerosols. Proliposomes that included the steroid beclometasone dipropionate (BDP) produced lower aerosol and lipid outputs than steroid-free proliposomes. Size analysis and transmission electron microscopy showed evidence of liposome formation within the nebuliser, followed by deaggregation and size reduction of multilamellar liposomes upon nebulisation into a two-stage impinger. For each formulation, no difference in thermal transitions was observed between delivered liposomes and those remaining in the nebuliser. However, the steroid (5 mol%) lowered the onset temperature and enthalpy of the pretransition, produced a similar onset temperature and a larger enthalpy of the main transition, and broadened both transitions. This indicates that BDP was entrapped and interacted with the liposome phospholipid membranes. Since the pretransition was depressed but not completely removed and no phase separation occurred, it is suggested that the bilayers of the multilamellar liposomes can entrap more than 5 mol% BDP. Overall, liposomes were generated from proliposomes, and DSC investigations indicated that the steroid was entrapped in the bilayers of the aerosolised multilamellar vesicles.

Relevance:

100.00%

Abstract:

We report a two-stage diode-pumped Er-doped fiber amplifier operating at a wavelength of 1550 nm and repetition rates of 10-100 kHz, with an average output power of up to 10 W. The first stage, comprising Er-doped fiber, was core-pumped at 1480 nm, whereas the second stage, comprising double-clad Er/Yb-doped fiber, was clad-pumped at 975 nm. The estimated peak power of the 0.4-nm full-width-at-half-maximum laser emission at 1550 nm exceeded 4 kW. The initial 100-ns seed diode laser pulse was compressed to 3.5 ns as a result of the 34-dB total amplification. The observed roughly 30-fold pulse compression points to a promising new nonlinear optical technique for generating high-power short pulses for applications in eye-safe ranging and micromachining.
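As a quick consistency check on the quoted figures (my own arithmetic, not part of the abstract), the gain factor implied by 34 dB and the compression ratio implied by the stated pulse durations are:

```latex
\[
G = 10^{34/10} \approx 2.5\times 10^{3}, \qquad
\frac{\tau_{\text{seed}}}{\tau_{\text{out}}} = \frac{100\ \text{ns}}{3.5\ \text{ns}} \approx 29,
\]
```

consistent with the roughly 30-fold compression reported.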

Relevance:

100.00%

Abstract:

Thesis (Master, Chemical Engineering) -- Queen's University, 2016-08-16.

Relevance:

100.00%

Abstract:

This paper investigates how textbook design may influence students' visual attention to graphics, photos, and text in current geography textbooks. Eye tracking, a visual method of data collection and analysis, was used to precisely monitor students' eye movements while they examined geography textbook spreads. In an exploratory study using random sampling, the eye movements of 20 students (secondary school students 15–17 years of age and university students 20–24 years of age) were recorded. The study materials were double-page spreads covering an identical topic, taken from five different current German geography textbooks. A two-stage test was developed. Each participant was first given the task of looking at the entire textbook spread to determine what was being explained on the pages. In the second stage, participants solved one of the tasks from the exercise section. Overall, each participant studied five different textbook spreads and completed five set tasks. After the eye tracking study, each participant completed a questionnaire. The results support textbook design as one crucial factor in successful knowledge acquisition from textbooks. Based on the eye tracking data, learning-related challenges posed by images and complex image-text structures in textbooks are elucidated and related to insights from educational psychology and findings from visual communication and textbook analysis.

Relevance:

100.00%

Abstract:

Single-stage and two-stage sodium sulfite cooking were carried out on spruce, pine, or pure pine heartwood chips to investigate the influence of several process parameters on the initial phase of such a cook, down to about 60% pulp yield. The cooking experiments were carried out in the laboratory with either a lab-prepared or a mill-prepared cooking acid, and the temperature and time were varied. The influences of dissolved organic and inorganic components in the cooking liquor on the final pulp composition and on the extent of side reactions were investigated. Kinetic equations were developed, and the activation energies for delignification and carbohydrate dissolution were calculated using the Arrhenius equation. A better understanding of the delignification mechanisms during bisulfite and acid sulfite cooking was obtained by analyzing the lignin-carbohydrate complexes (LCC) present in the pulp under different cooking conditions. A mill-prepared cooking acid showed beneficial effects compared with a lab-prepared cooking acid with respect to side reactions, extractives removal, and pH stability during the cook; however, no significant difference in the degree of delignification or carbohydrate degradation was observed. The cellulose yield was not affected in the initial phase of the cook; however, temperature influenced the rates of both delignification and hemicellulose removal. The corresponding activation energies increased in the order xylan, glucomannan, lignin, cellulose. The cooking temperature could thus be used to steer the cook to a given carbohydrate composition in the final pulp. Lignin condensation reactions were observed during acid sulfite cooking, especially at higher temperatures. The LCC studies indicated the existence of covalent bonds between lignin and the hemicellulose components xylan and glucomannan. LCC in native wood showed the presence of phenyl glycosides, γ-esters, and α-ethers, whereas the α-ethers were affected during sulfite pulping. The existence of covalent bonds between lignin and wood polysaccharides might be the rate-limiting factor in sulfite pulping.
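The activation energies mentioned above come from the temperature dependence of the fitted rate constants via the Arrhenius equation. The following minimal sketch, using made-up rate constants rather than the thesis data, shows the standard ln k versus 1/T regression used for such an estimate.

```python
import numpy as np

R = 8.314  # J/(mol*K)

# Hypothetical first-order rate constants for delignification at three
# cooking temperatures (illustrative values only, not the thesis data).
T = np.array([413.0, 423.0, 433.0])        # K (140, 150, 160 degC)
k = np.array([2.1e-3, 4.3e-3, 8.2e-3])     # 1/min

# Arrhenius: k = A * exp(-Ea / (R*T))  =>  ln k = ln A - (Ea/R) * (1/T)
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R          # activation energy, J/mol
A = np.exp(intercept)    # pre-exponential factor, 1/min

print(f"Ea ~ {Ea / 1000:.0f} kJ/mol, A ~ {A:.2e} 1/min")
```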

Relevance:

100.00%

Abstract:

The Three-Dimensional Single-Bin-Size Bin Packing Problem is one of the most studied problems in the Cutting & Packing category. From a strictly mathematical point of view, it consists of packing a finite set of strongly heterogeneous “small” boxes, called items, into a finite set of identical “large” rectangular containers, called bins, minimizing the unused volume and requiring that the items are packed without overlapping. The great interest in the problem is mainly due to the number of real-world applications in which it arises, such as pallet and container loading, cutting objects out of a piece of material, and packaging design. Depending on the application, additional objective functions and practical constraints may be needed. After a brief discussion of the real-world applications of the problem and an exhaustive literature review, the design of a two-stage algorithm to solve the aforementioned problem is presented. The algorithm must provide the spatial coordinates of the placed boxes' vertices as well as the optimal box input sequence, while satisfying geometric, stability, and fragility constraints within a reduced computational time. Due to the NP-hard complexity of this type of combinatorial problem, a fusion of metaheuristic and machine learning techniques is adopted; in particular, a hybrid genetic algorithm coupled with a feedforward neural network is used. In the first stage, a rich dataset is created from a set of real input instances provided by an industrial company, and the feedforward neural network is trained on it. After training, given a new input instance, the hybrid genetic algorithm runs using the neural network output as its input parameter vector and provides the optimal solution as output. The effectiveness of the proposed approach is confirmed via several experimental tests.
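As a rough illustration of the metaheuristic component only (not the thesis's actual hybrid genetic algorithm or its neural-network parameter tuning), the sketch below evolves a box-ordering permutation and decodes it with a naive single-bin, shelf-style placement. The instance data, decoder, and all names are assumptions for demonstration purposes.

```python
import random

# Illustrative boxes (w, d, h) and a single bin size; not real instance data.
BOXES = [(4, 3, 2), (2, 2, 2), (5, 2, 3), (3, 3, 3), (2, 5, 2), (4, 4, 2)]
BIN = (10, 10, 10)

def decode(order):
    """Naive shelf decoder: place boxes left-to-right in rows, rows stacked in
    layers; returns the fraction of bin volume filled by boxes that fit."""
    x = y = z = 0
    row_d = layer_h = 0
    placed_volume = 0
    for i in order:
        w, d, h = BOXES[i]
        if x + w > BIN[0]:                 # start a new row
            x, y = 0, y + row_d
            row_d = 0
        if y + d > BIN[1]:                 # start a new layer
            x, y, z = 0, 0, z + layer_h
            row_d = layer_h = 0
        if z + h > BIN[2]:                 # box does not fit at all
            continue
        placed_volume += w * d * h
        x += w
        row_d = max(row_d, d)
        layer_h = max(layer_h, h)
    return placed_volume / (BIN[0] * BIN[1] * BIN[2])

def crossover(p1, p2):
    """Order crossover (OX): copy a slice of p1, fill the rest from p2."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def mutate(perm, rate=0.2):
    """Swap two positions with a small probability."""
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def genetic_algorithm(pop_size=30, generations=100):
    pop = [random.sample(range(len(BOXES)), len(BOXES)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=decode, reverse=True)        # higher fill ratio is better
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    best = max(pop, key=decode)
    return best, decode(best)

if __name__ == "__main__":
    order, fill = genetic_algorithm()
    print("best box order:", order, "fill ratio:", round(fill, 3))
```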

Relevance:

100.00%

Abstract:

In this PhD thesis a new firm-level conditional risk measure is developed. It is named Joint Value at Risk (JVaR) and is defined as a quantile of a conditional distribution of interest, where the conditioning event is a latent upper-tail event. It addresses the problem of how risk changes under extreme volatility scenarios. The properties of JVaR are studied based on a stochastic volatility representation of the underlying process. We prove that JVaR is leverage consistent, i.e. it is an increasing function of the dependence parameter in the stochastic representation. A feasible class of nonparametric M-estimators is introduced by exploiting the elicitability of quantiles and stochastic ordering theory. Consistency and asymptotic normality of the two-stage M-estimator are derived, and a simulation study is reported to illustrate its finite-sample properties. Parametric estimation methods are also discussed. The relation with VaR is exploited to introduce a volatility contribution measure, and a tail risk measure is also proposed. The analysis of the dynamic JVaR is presented based on asymmetric stochastic volatility models. Empirical results with S&P500 data show that accounting for extreme volatility levels is relevant for better characterizing the evolution of risk. The work is complemented by a review of the literature, providing an overview of quantile risk measures, elicitable functionals, and several stochastic orderings.
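To convey the basic idea of a quantile conditional on an upper-tail volatility event (a drastic simplification, not the thesis's JVaR estimator), the sketch below simulates a toy stochastic-volatility model and takes the empirical quantile of returns on days whose latent volatility lies in its own upper tail. The model form and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy log-volatility AR(1) stochastic volatility model (illustrative parameters).
n, phi, sigma_eta, rho = 100_000, 0.95, 0.2, -0.5   # rho: leverage-type dependence
h = np.zeros(n)
eta = rng.normal(size=n)
eps = rho * eta + np.sqrt(1 - rho**2) * rng.normal(size=n)
for t in range(1, n):
    h[t] = phi * h[t - 1] + sigma_eta * eta[t]
returns = np.exp(h / 2) * eps

# Conditioning event: latent volatility in its own upper tail (here, top 5%).
vol_threshold = np.quantile(h, 0.95)
tail_returns = returns[h >= vol_threshold]

alpha = 0.05
var_unconditional = np.quantile(returns, alpha)
jvar_like = np.quantile(tail_returns, alpha)   # quantile given extreme volatility

print(f"unconditional {alpha:.0%} quantile: {var_unconditional:.3f}")
print(f"conditional-on-extreme-volatility quantile: {jvar_like:.3f}")
```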

Relevance:

100.00%

Abstract:

Latency can be defined as the sum of the arrival times at the customers. Minimum latency problems are especially relevant in applications related to humanitarian logistics. This thesis presents algorithms for solving a family of vehicle routing problems with minimum latency. First, the latency location routing problem (LLRP) is considered. It consists of determining the subset of depots to be opened and the routes that a set of homogeneous capacitated vehicles must perform in order to visit a set of customers, such that the sum of the demands of the customers assigned to each vehicle does not exceed the capacity of the vehicle. For solving this problem, three metaheuristic algorithms combining simulated annealing and variable neighborhood descent, as well as an iterated local search (ILS) algorithm, are proposed. Furthermore, the multi-depot cumulative capacitated vehicle routing problem (MDCCVRP) and the multi-depot k-traveling repairman problem (MDk-TRP) are solved with the proposed ILS algorithm. The MDCCVRP is a special case of the LLRP in which all the depots can be opened, and the MDk-TRP is a special case of the MDCCVRP in which the capacity constraints are relaxed. Finally, an LLRP with stochastic travel times is studied. A two-stage stochastic programming model and a variable neighborhood search algorithm are proposed for solving the problem, and a sampling method is developed for tackling instances with an infinite number of scenarios. Extensive computational experiments show that the proposed methods are effective for solving the problems under study.
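For readers unfamiliar with the latency objective, the following small sketch (an illustration on a made-up toy instance, not any of the thesis algorithms) computes the latency of a route, i.e. the sum of arrival times at the customers, and brute-forces the best visiting order.

```python
import itertools

def route_latency(route, depot, dist):
    """Latency of one route = sum of arrival times at the customers,
    starting from the depot (travel times given by the dist function)."""
    latency, arrival, prev = 0.0, 0.0, depot
    for customer in route:
        arrival += dist(prev, customer)
        latency += arrival
        prev = customer
    return latency

def best_single_route_order(customers, depot, dist):
    """Brute-force the minimum-latency order for a tiny instance
    (exponential; only meant to illustrate the objective)."""
    return min(itertools.permutations(customers),
               key=lambda perm: route_latency(perm, depot, dist))

# Tiny illustrative instance: points on a line, depot at node 0 (made-up data).
points = {0: 0.0, 1: 2.0, 2: 5.0, 3: 9.0}
dist = lambda a, b: abs(points[a] - points[b])

best = best_single_route_order([1, 2, 3], depot=0, dist=dist)
print("best visiting order:", best, "latency:", route_latency(best, 0, dist))
```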

Relevance:

50.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying "true" hydraulic conductivity field. We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
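At a very high level, the workflow pairs a downscaling step with a petrophysical mapping step. The toy 1-D sketch below only conveys that two-step structure; it uses entirely hypothetical data, a trivial block-mean "geophysical" image, and a plain linear regression, not the Bayesian sequential simulation of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Hypothetical 1-D setup (illustrative only) ------------------------------
n_fine = 200
x = np.arange(n_fine)
true_sigma = 1.0 + 0.3 * np.sin(x / 15.0) + 0.05 * rng.normal(size=n_fine)

# Spatially exhaustive but poorly resolved geophysical estimate:
# block averages over 20-cell windows (a stand-in for a smooth tomogram).
block = 20
coarse_sigma = true_sigma.reshape(-1, block).mean(axis=1)

# Sparse collocated borehole data: fine-scale sigma and log hydraulic conductivity.
wells = np.array([25, 90, 160])
log_K_wells = 2.0 * true_sigma[wells] - 5.0 + 0.1 * rng.normal(size=wells.size)

# --- Step 1: stochastic downscaling of the coarse geophysical field ----------
# Each fine cell gets its block mean plus a random small-scale fluctuation.
downscaled = np.repeat(coarse_sigma, block) + 0.05 * rng.normal(size=n_fine)

# --- Step 2: relate downscaled geophysics to hydraulic conductivity ----------
# Linear regression fitted on the collocated borehole samples.
slope, intercept = np.polyfit(true_sigma[wells], log_K_wells, 1)
resid_std = np.std(log_K_wells - (slope * true_sigma[wells] + intercept))

# One stochastic realization of the fine-scale log-conductivity field.
log_K_realization = (slope * downscaled + intercept
                     + resid_std * rng.normal(size=n_fine))
print("realization mean log K:", round(log_K_realization.mean(), 2))
```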

Relevance:

50.00%

Abstract:

Purpose: Letrozole (LET) has recently been shown to be superior to tamoxifen for postmenopausal patients (pts). In addition, LET radiosensitizes breast cancer cells in vitro. We conducted a phase II randomized study to evaluate concurrent and sequential radiotherapy (RT)-LET in the adjuvant setting. We present here clinical results with a minimum follow-up of 24 months. Patients and Methods: Postmenopausal pts with early-stage breast cancer were randomized after conservative surgery to either: A) concurrent RT-LET (LET started 3 weeks before the first day of RT) or B) sequential RT-LET (LET started 3 weeks after the end of RT). Whole-breast RT was delivered to a total dose of 50 Gy. A 10-16 Gy boost was allowed according to age and pathological prognostic factors. Pts were stratified by center, adjuvant chemotherapy, boost, and radiation-induced CD8 lymphocyte apoptosis (RILA). RILA was assessed before RT as previously published (Ozsahin et al., Clin Cancer Res, 2005). An independent monitoring committee reviewed individual safety data. Skin toxicities were evaluated by two different clinicians at each medical visit (CTCAE v3.0). Lung CT scans and pulmonary function tests were performed regularly. DNA samples were screened for SNPs in candidate genes as recently published (Azria et al., Clin Cancer Res, 2008). Results: A total of 150 pts were randomized between 01/05 and 02/07. Median follow-up is 26 months (range, 3-40 months). No statistical differences were identified between the two arms in terms of mean age, initial TNM stage, median surgical bed volume, or postsurgical breast volume. Chemotherapy and an RT boost were delivered in 19% and 38% of pts, respectively. Nodes received 50 Gy in 23% of patients, with no difference between arms. During RT and within the first 6 weeks after RT, 10 patients (6.7%) presented grade 3 acute skin dermatitis, with no difference between arms (4 and 6 patients in arms A and B, respectively). At 26 months of follow-up, grade 2 or worse radiation-induced subcutaneous fibrosis (RISCF) was present in 4 patients (3%), without any difference between arm A (n = 2) and arm B (n = 2), p = 0.93. In both arms, all patients who presented a RISCF had a RILA lower than 16%. Sensitivity and specificity were 100% and 39%, respectively. No acute lung toxicities were observed, and quality of life was good to excellent for all patients. SNP analyses are still ongoing (Pr Rosenstein, NY). Conclusion: Acute and early late grade 2 dermatitis were similar in both arms. The only factor that influenced RISCF was a low radiation-induced lymphocyte apoptosis yield. We prospectively confirmed the capacity of RILA to identify patients hypersensitive to radiation: patients with RILA greater than 16% did not present late effects of radiation, confirming the first prospective trial we published in 2005 (Ozsahin et al., Clin Cancer Res).

Relevance:

50.00%

Abstract:

BACKGROUND: Letrozole radiosensitises breast cancer cells in vitro. In clinical settings, no data exist for the combination of letrozole and radiotherapy. We assessed concurrent and sequential radiotherapy and letrozole in the adjuvant setting. METHODS: This phase 2 randomised trial was undertaken in two centres in France and one in Switzerland between Jan 12, 2005, and Feb 21, 2007. 150 postmenopausal women with early-stage breast cancer were randomly assigned after conserving surgery to either concurrent radiotherapy and letrozole (n=75) or sequential radiotherapy and letrozole (n=75). Randomisation was open label with a minimisation technique, stratified by investigational centres, chemotherapy (yes vs no), radiation boost (yes vs no), and value of radiation-induced lymphocyte apoptosis (≤16% vs >16%). Whole breast was irradiated to a total dose of 50 Gy in 25 fractions over 5 weeks. In the case of supraclavicular and internal mammary node irradiation, the dose was 44-50 Gy. Letrozole was administered orally once daily at a dose of 2.5 mg for 5 years (beginning 3 weeks pre-radiotherapy in the concomitant group, and 3 weeks post-radiotherapy in the sequential group). The primary endpoint was the occurrence of acute (during and within 6 weeks of radiotherapy) and late (within 2 years) radiation-induced grade 2 or worse toxic effects of the skin. Analyses were by intention to treat. This study is registered with ClinicalTrials.gov, number NCT00208273. FINDINGS: All patients were analysed apart from one in the concurrent group who withdrew consent before any treatment. During radiotherapy and within the first 12 weeks after radiotherapy, 31 patients in the concurrent group and 31 in the sequential group had any grade 2 or worse skin-related toxicity. The most common skin-related adverse event was dermatitis: four patients in the concurrent group and six in the sequential group had grade 3 acute skin dermatitis during radiotherapy. At a median follow-up of 26 months (range 3-40), two patients in each group had grade 2 or worse late effects (both radiation-induced subcutaneous fibrosis). INTERPRETATION: Letrozole can be safely delivered shortly after surgery and concomitantly with radiotherapy. Long-term follow-up is needed to investigate cardiac side-effects and cancer-specific outcomes. FUNDING: Novartis Oncology France.

Relevance:

50.00%

Abstract:

Purpose/Objective(s): Letrozole radiosensitizes breast cancer cells in vitro. In clinical settings, no data exist for the combination of letrozole and radiotherapy. We assessed concurrent and sequential radiotherapy and letrozole in the adjuvant setting. Materials/Methods: The present study is registered with ClinicalTrials.gov, number NCT00208273. This Phase 2 randomized trial was undertaken in two centers in France and one in Switzerland between January 12, 2005, and February 21, 2007. One hundred fifty postmenopausal women with early-stage breast cancer were randomly assigned after conserving surgery to either concurrent radiotherapy and letrozole (n = 75) or sequential radiotherapy and letrozole (n = 75). Randomization was open label with a minimization technique, stratified by investigational centers, chemotherapy (yes vs. no), radiation boost (yes vs. no), and value of radiation-induced lymphocyte apoptosis (≤16% vs. >16%). The whole breast was irradiated to a total dose of 50 Gy in 25 fractions over 5 weeks. In the case of supraclavicular and internal mammary node irradiation, the dose was 44-50 Gy. Letrozole was administered orally once daily at a dose of 2.5 mg for 5 years (beginning 3 weeks pre-radiotherapy in the concomitant group, and 3 weeks post-radiotherapy in the sequential group). The primary endpoint was the occurrence of acute (during and within 6 weeks of radiotherapy) and late (within 2 years) radiation-induced Grade 2 or worse toxic effects of the skin and lung (pulmonary function test and lung CT scan). Analyses were by intention-to-treat. The long-term follow-up after 2 years was performed only in Montpellier (n = 121) and evaluated skin toxicity (clinical examination every 6 months), lung fibrosis (one CT scan yearly), and cosmetic outcome. Results: All patients were analyzed apart from 1 in the concurrent group who withdrew consent before any treatment. Within the first 2 years (n = 149), no lung toxicity was identified by CT scan and no change from baseline was noted in the lung diffusion capacity test. Two patients in each group had Grade 2 or worse late effects (both radiation-induced subcutaneous fibrosis [RISF]). After 2 years (n = 121), and with a median follow-up of 50 months (38-62), 2 patients (1 in each arm) presented a Grade 3 RISF. No lung toxicity was identified by CT scan. Cosmetic results (photographs) and quality of life were good to excellent. All patients who had Grade 3 subcutaneous fibrosis had an RILA value of 16% or less, irrespective of the sequence with letrozole. Conclusions: With long-term follow-up, letrozole can be safely delivered shortly after surgery and concomitantly with radiotherapy.

Relevance:

50.00%

Abstract:

Two voters must choose between two alternatives. Voters vote in a fixed linear order. If there is no unanimity for either alternative, the procedure is repeated. At every stage, each voter prefers the same alternative over the other, has utilities that decrease with the stage, and has an impatience degree indicating when it is worth voting for the non-preferred alternative now rather than waiting for the next stage and voting for the preferred alternative. Intuition suggests that the more patient voter will obtain his preferred alternative. I found that in the unique solution of the sequential voting procedure obtained by backward induction, the first voter gets his preferred alternative at the first stage, independently of his impatience rate. Keywords: sequential voting, impatience rate, multi-stage voting, unanimity
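To make the backward-induction logic concrete, here is a minimal toy version of the game (my own stylised payoffs, discounting, and terminal rule, not the paper's model): voter 1 votes first at each stage, voter 2 then either matches (unanimity) or forces another stage, and disagreement at the final stage yields a status-quo payoff of zero.

```python
def solve_stage(stage, max_stage, prefs, v, delta):
    """Backward induction for a stylised two-voter sequential unanimity game.

    prefs[i] : voter i's preferred alternative ('A' or 'B')
    v        : payoff of the non-preferred alternative (preferred pays 1)
    delta[i] : voter i's per-stage discount (impatience) factor
    Returns (winning alternative, stage it is decided, payoffs per voter).
    Disagreement at the final stage yields the status-quo payoff 0.
    """
    discount = [delta[i] ** (stage - 1) for i in range(2)]

    def payoff(i, alt):
        return discount[i] * (1.0 if alt == prefs[i] else v)

    best = None
    for vote1 in ("A", "B"):
        # Voter 2's options: match vote1 now, or disagree and continue the game.
        match_payoffs = (payoff(0, vote1), payoff(1, vote1))
        if stage == max_stage:
            disagree_payoffs, disagree_outcome = (0.0, 0.0), (None, max_stage)
        else:
            alt, dec_stage, cont = solve_stage(stage + 1, max_stage, prefs, v, delta)
            disagree_payoffs, disagree_outcome = cont, (alt, dec_stage)
        if match_payoffs[1] >= disagree_payoffs[1]:      # voter 2 matches
            result = (vote1, stage, match_payoffs)
        else:                                            # voter 2 disagrees
            result = (disagree_outcome[0], disagree_outcome[1], disagree_payoffs)
        if best is None or result[2][0] > best[2][0]:    # voter 1 picks the better vote
            best = result
    return best

# Voter 1 prefers A and is very impatient; voter 2 prefers B and is patient.
outcome, decided_at, payoffs = solve_stage(
    1, max_stage=5, prefs=("A", "B"), v=0.4, delta=(0.5, 0.99))
print(f"{outcome} wins at stage {decided_at}; payoffs = {payoffs}")
```

In this toy version the second voter always concedes immediately, because waiting only discounts the same non-preferred outcome, so the first voter's alternative wins at stage 1 regardless of how impatient the first voter is, mirroring the result stated in the abstract.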