12 results for "combined stage sintering model"

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 100.00%

Abstract:

Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are coupled in order to obtain a globally optimal solution. Setups that are typically present in lot sizing problems are relaxed together with integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains when compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
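The column generation idea in this abstract can be illustrated on the classical cutting stock subproblem alone. The sketch below is a minimal toy with invented stock length, part sizes, and demands (the paper's combined lot-sizing/cutting model is much richer): it solves the LP relaxation by repeatedly pricing a new cutting pattern with an unbounded-knapsack dynamic program and adding it while its reduced cost is negative.

```python
# Toy column generation for the LP relaxation of a cutting stock problem.
# All data are hypothetical, not taken from the paper.
import numpy as np
from scipy.optimize import linprog

ROLL = 10                       # stock length (hypothetical)
sizes = np.array([3, 4, 5])     # part lengths
demand = np.array([30, 20, 40])

# Start from trivial patterns: cut each roll into a single part type.
patterns = [np.array([ROLL // s if j == i else 0 for j in range(len(sizes))])
            for i, s in enumerate(sizes)]

def price(duals):
    """Unbounded knapsack: most valuable pattern under the dual prices."""
    best = np.zeros(ROLL + 1)
    keep = [np.zeros(len(sizes), dtype=int) for _ in range(ROLL + 1)]
    for cap in range(1, ROLL + 1):
        best[cap], keep[cap] = best[cap - 1], keep[cap - 1]
        for j, s in enumerate(sizes):
            if s <= cap and best[cap - s] + duals[j] > best[cap]:
                best[cap] = best[cap - s] + duals[j]
                keep[cap] = keep[cap - s] + np.eye(len(sizes), dtype=int)[j]
    return best[ROLL], keep[ROLL]

while True:
    A = np.column_stack(patterns)
    # min 1'x  s.t.  A x >= demand, x >= 0  (written as -A x <= -demand)
    res = linprog(np.ones(A.shape[1]), A_ub=-A, b_ub=-demand,
                  bounds=(0, None), method="highs")
    duals = -res.ineqlin.marginals           # dual prices of the demand rows
    value, pattern = price(duals)
    if value <= 1 + 1e-6:                    # no negative reduced cost left
        break
    patterns.append(pattern)

lp_bound = res.fun
print(round(lp_bound, 2))                    # LP lower bound on rolls used
```

With the data above the procedure generates the pattern (2, 1, 0) — two parts of length 3 plus one of length 4 — and the LP bound drops from 40 to 37.5 rolls.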

Relevance: 100.00%

Abstract:

The Amazon Basin is crucial to global circulatory and carbon patterns due to its large areal extent and large flux magnitude. Biogeophysical models have had difficulty reproducing the annual cycle of net ecosystem exchange (NEE) of carbon in some regions of the Amazon, generally simulating uptake during the wet season and efflux during seasonal drought. In reality, the opposite occurs. Observational and modeling studies have identified several mechanisms that explain the observed annual cycle, including: (1) deep soil columns that can store large amounts of water, (2) the ability of deep roots to access moisture at depth when near-surface soil dries during the annual drought, (3) movement of water in the soil via hydraulic redistribution, allowing for more efficient uptake of water during the wet season and moistening of near-surface soil during the annual drought, and (4) the photosynthetic response to elevated light levels as cloudiness decreases during the dry season. We incorporate these mechanisms into the third version of the Simple Biosphere model (SiB3), both singly and collectively, and confront the results with observations. For the forest to maintain function through seasonal drought, there must be sufficient water storage in the soil to sustain transpiration through the dry season, in addition to the ability of the roots to access the stored water. We find that none of these mechanisms alone produces a simulation of the annual cycle of NEE that matches the observations. When these mechanisms are combined into the model, NEE follows the general trend of the observations, showing efflux during the wet season and uptake during seasonal drought.

Relevance: 40.00%

Abstract:

The design of translation-invariant and locally defined binary image operators over large windows is made difficult by decreased statistical precision and increased training time. We present a complete framework for the application of stacked design, a recently proposed technique to create two-stage operators that circumvents this difficulty. We propose a novel algorithm, based on information theory, to find groups of pixels that should be used together to predict the output value. We employ this algorithm to automate the process of creating a set of first-level operators that are later combined into a global operator. We also propose a principled way to guide this combination, using feature selection and model comparison. Experimental results show that the proposed framework leads to better results than single-stage design. (C) 2009 Elsevier B.V. All rights reserved.
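As a toy illustration of scoring pixel groups by information content (invented data, not the paper's actual algorithm), the sketch below estimates the mutual information between the joint value of a pixel group and the output. The group that actually determines the output scores about one bit; an irrelevant group scores near zero.

```python
# Hypothetical sketch: rank groups of binary window pixels by their
# empirical mutual information with the output value.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n = 5000
X = rng.integers(0, 2, size=(n, 4))   # four binary window pixels
y = X[:, 0] ^ X[:, 1]                 # output depends only on pixels 0 and 1

def mutual_info(cols, X, y):
    """Empirical mutual information I(X[:, cols]; y) in bits."""
    xs = [tuple(row) for row in X[:, cols]]
    joint = Counter(zip(xs, y.tolist()))
    px, py = Counter(xs), Counter(y.tolist())
    return sum((c / n) * np.log2(c * n / (px[xv] * py[yv]))
               for (xv, yv), c in joint.items())

mi_informative = mutual_info([0, 1], X, y)   # ~1 bit: group determines y
mi_irrelevant = mutual_info([2, 3], X, y)    # ~0 bits: group is noise
print(round(mi_informative, 2), round(mi_irrelevant, 3))
```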

Relevance: 30.00%

Abstract:

Objectives: (1) To compare the anatomopathological variables and recurrence rates in patients with early-stage adenocarcinoma (AC) and squamous cell carcinoma (SCC) of the uterine cervix; (2) to identify the independent risk factors for recurrence. Study design: This historical cohort study assessed 238 patients with carcinoma of the uterine cervix (stages IB and IIA) who underwent radical hysterectomy with pelvic lymph node dissection between 1980 and 1999. Comparison of categorical variables between the two histological types was carried out using Pearson's chi-square test or Fisher's exact test. Disease-free survival rates for AC and SCC were calculated using the Kaplan-Meier method and the curves were compared using the log-rank test. The Cox proportional hazards model was used to identify the independent risk factors for recurrence. Results: There were 35 cases of AC (14.7%) and 203 of SCC (85.3%). AC presented lower histological grade than SCC (grade 1: 68.6% versus 9.4%; p < 0.001), a lower rate of lymphovascular space involvement (25.7% versus 53.7%; p = 0.002), a lower rate of invasion into the middle or deep thirds of the uterine cervix (40.0% versus 80.8%; p < 0.001) and a lower rate of lymph node metastasis (2.9% versus 16.3%; p = 0.036). Although the recurrence rate was lower for AC than for SCC (11.4% versus 15.8%), this difference was not statistically significant (p = 0.509). Multivariate analysis identified three independent risk factors for recurrence: presence of metastases in the pelvic lymph nodes, invasion of the deep third of the uterine cervix, and absence of, or only a slight, inflammatory reaction in the cervix. When these variables were adjusted for histological type and radiotherapy status, they remained in the model as independent risk factors. Conclusion: The AC group showed less aggressive histological behavior than the SCC group, but no difference in disease-free survival rates was noted. (C) 2006 Elsevier Ireland Ltd. All rights reserved.
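A minimal sketch of the Kaplan-Meier estimator used for the disease-free survival curves; the recurrence times and censoring flags below are invented toy data, not the study's.

```python
# Kaplan-Meier product-limit estimate on hypothetical follow-up data.
import numpy as np

def kaplan_meier(time, event):
    """Return event times and the Kaplan-Meier survival estimate S(t)."""
    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(time):
        d = int(((time == t) & (event == 1)).sum())  # recurrences at t
        n = int((time >= t).sum())                   # patients still at risk
        if d:
            surv *= 1 - d / n
            out_t.append(float(t))
            out_s.append(surv)
    return out_t, out_s

t = np.array([6, 7, 10, 15, 19, 25])   # months to recurrence or censoring
e = np.array([1, 0, 1, 1, 0, 1])       # 1 = recurrence, 0 = censored
ts, ss = kaplan_meier(t, e)
print(ts, [round(s, 3) for s in ss])
```

Censored patients (event = 0) leave the risk set without forcing a step in the curve, which is why the survival estimate only drops at observed recurrence times.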

Relevance: 30.00%

Abstract:

Oocyte developmental competence depends on maternal stores that support development throughout a transcriptionally silent period during early embryogenesis. Previous attempts to investigate transcripts associated with oocyte competence have relied on prospective models, which are mostly based on morphological criteria. Using a retrospective model, we quantitatively compared mRNA among oocytes with different developmental competence. A cytoplasm biopsy was removed from in vitro matured oocytes to perform comparative analysis of amounts of global polyadenylated (polyA) mRNA and housekeeping gene transcripts. After parthenogenetic activation of biopsied oocytes, presumptive zygotes were cultured individually in vitro and oocytes were classified according to embryo development: (i) blocked before the 8-cell stage; (ii) blocked between the 8-cell and morula stages; or (iii) developed to the blastocyst stage. Sham-manipulated controls confirmed that biopsies did not alter development outcome. Total polyA mRNA amounts correlate with oocyte diameter but not with the ability to develop to the 8-cell and blastocyst stages. The latter was also confirmed by relative quantification of GAPDH, H2A and Hprt1 transcripts. In conclusion, we describe a novel retrospective model to identify putative markers of developmental competence in single oocytes and demonstrate that global mRNA amounts at the metaphase II stage do not correlate with embryo development in vitro.

Relevance: 30.00%

Abstract:

We have developed a spectrum synthesis method for modeling the ultraviolet (UV) emission from the accretion disks of cataclysmic variables (CVs). The disk is separated into concentric rings, each with an internal structure taken from the Wade & Hubeny disk-atmosphere models. For each ring, a wind atmosphere is calculated in the comoving frame, with a vertical velocity structure obtained from a solution of the Euler equation. Using simple assumptions regarding rotation and the wind streamlines, these one-dimensional models are combined into a single 2.5-dimensional model for which we compute synthetic spectra. We find that the resulting line and continuum behavior as a function of the orbital inclination is consistent with observations, and verify that the accretion rate affects the wind temperature, leading to corresponding trends in the intensity of UV lines. In general, we also find that the primary mass has a strong effect on the P Cygni absorption profiles, that the synthetic emission line profiles are strongly sensitive to the wind temperature structure, and that an increase in the mass-loss rate enhances the resonance line intensities. Synthetic spectra were compared with UV data for two high-orbital-inclination nova-like CVs, RW Tri and V347 Pup. We needed to include disk regions with arbitrarily enhanced mass loss to reproduce the line widths and profiles reasonably well. This fact, and a lack of flux in some high-ionization lines, may be the signature of density-enhanced regions in the wind, or alternatively may result from inadequacies in some of our simplifying assumptions.

Relevance: 30.00%

Abstract:

A new accelerating cosmology driven only by baryons plus cold dark matter (CDM) is proposed in the framework of general relativity. In this scenario the present accelerating stage of the Universe is powered by the negative pressure describing the gravitationally induced particle production of cold dark matter particles. This kind of scenario has only one free parameter, and the differential equation governing the evolution of the scale factor is exactly the same as that of the Lambda-CDM model. For a spatially flat Universe, as predicted by inflation (Omega_dm + Omega_baryon = 1), it is found that the effectively observed matter density parameter is Omega_m,eff = 1 - alpha, where alpha is the constant parameter specifying the CDM particle creation rate. The supernovae test based on the Union data (2008) requires alpha ≈ 0.71, so that Omega_m,eff ≈ 0.29, as independently derived from weak gravitational lensing, the large-scale structure and other complementary observations.
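Since the abstract states that the scale-factor equation is identical to Lambda-CDM's, the one-parameter claim amounts to an expansion history H(z)^2/H0^2 = (1 - alpha)(1 + z)^3 + alpha, i.e. flat Lambda-CDM with Omega_m = 1 - alpha. A small numerical check of that identification (the functional forms below are the standard flat Lambda-CDM ones; treating the creation-CDM history as exactly this form follows the abstract's statement):

```python
# Check that the creation-CDM expansion history with parameter alpha
# coincides with flat Lambda-CDM at Omega_m = 1 - alpha.
alpha = 0.71                     # creation-rate value favored by Union SNe

def E2_ccdm(z, alpha):
    """H(z)^2 / H0^2 for the creation-CDM scenario (same form as LCDM)."""
    return (1 - alpha) * (1 + z) ** 3 + alpha

def E2_lcdm(z, omega_m):
    """H(z)^2 / H0^2 for flat Lambda-CDM."""
    return omega_m * (1 + z) ** 3 + (1 - omega_m)

omega_m_eff = 1 - alpha          # effectively observed matter density, ~0.29
diff = max(abs(E2_ccdm(z / 10, alpha) - E2_lcdm(z / 10, omega_m_eff))
           for z in range(0, 21))
print(round(omega_m_eff, 2), diff < 1e-12)
```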

Relevance: 30.00%

Abstract:

Phylogenetic analyses of chloroplast DNA sequences, morphology, and combined data have provided consistent support for many of the major branches within the angiosperm clade Dipsacales. Here we use sequences from three mitochondrial loci to test the existing broad-scale phylogeny and to attempt to resolve several relationships that have remained uncertain. Parsimony, maximum likelihood, and Bayesian analyses of a combined mitochondrial data set recover trees broadly consistent with previous studies, although resolution and support are lower than in the largest chloroplast analyses. Combining chloroplast and mitochondrial data results in a generally well-resolved and very strongly supported topology, but the previously recognized problem areas remain. To investigate why these relationships have been difficult to resolve, we conducted a series of experiments using different data partitions and heterogeneous substitution models. More complex modeling schemes are usually favored regardless of the partitions recognized, but model choice had little effect on topology or support values. In contrast, there are consistent but weakly supported differences in the topologies recovered from coding and non-coding matrices. These conflicts correspond directly to relationships that were poorly resolved in analyses of the full combined chloroplast-mitochondrial data set. We suggest that incongruent signal has contributed to our inability to confidently resolve these problem areas. (c) 2007 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

In this paper, we consider the problem of estimating the number of times an air quality standard is exceeded in a given period of time. A non-homogeneous Poisson model is proposed to analyse this issue. The rate at which the Poisson events occur is given by a rate function lambda(t), t >= 0, which depends on some parameters that need to be estimated. Two forms of lambda(t) are considered: one of the Weibull form and the other of the exponentiated-Weibull form. Parameter estimation is performed using a Bayesian formulation based on the Gibbs sampling algorithm. The assignment of prior distributions for the parameters is made in two stages: in the first stage, non-informative prior distributions are considered; using the information provided by the first stage, more informative prior distributions are used in the second. The theoretical development is applied to data provided by the monitoring network of Mexico City. The rate function that best fits the data varies according to the region of the city and/or the threshold considered. In some cases the best fit is the Weibull form and in others the exponentiated-Weibull form. Copyright (C) 2007 John Wiley & Sons, Ltd.
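For the Weibull-form rate, lambda(t) = (beta/sigma) * (t/sigma)^(beta-1), the mean function m(T) = (T/sigma)^beta gives the expected number of exceedances in [0, T]. The sketch below (invented parameter values, not the paper's Mexico City estimates) simulates this non-homogeneous Poisson process exactly by drawing the count from Poisson(m(T)) and placing the event times by inverting the mean function:

```python
# Exact simulation of a non-homogeneous Poisson process with a
# Weibull-form rate; beta, sigma, and T are hypothetical values.
import numpy as np

def mean_fn(T, beta, sigma):
    """m(T) = (T/sigma)**beta: expected exceedances in [0, T]."""
    return (T / sigma) ** beta

def simulate(T, beta, sigma, rng):
    """Poisson count, then event times with CDF m(t)/m(T) = (t/T)**beta."""
    n = rng.poisson(mean_fn(T, beta, sigma))
    u = rng.uniform(0.0, 1.0, n)
    return np.sort(T * u ** (1.0 / beta))   # inverse-CDF placement

rng = np.random.default_rng(1)
beta, sigma, T = 1.5, 30.0, 365.0           # hypothetical parameters
expected = mean_fn(T, beta, sigma)
mean_count = np.mean([len(simulate(T, beta, sigma, rng)) for _ in range(2000)])
print(round(expected, 1), round(float(mean_count), 1))
```

Averaged over many replications, the simulated count matches the analytic mean m(T), which is the quantity the paper's Bayesian machinery estimates from monitoring data.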

Relevance: 30.00%

Abstract:

Predictors of random effects are usually based on the popular mixed effects (ME) model, developed under the assumption that the sample is obtained from a conceptually infinite population; such predictors are employed even when the actual population is finite. Two alternatives that incorporate the finite nature of the population are obtained from the superpopulation model proposed by Scott and Smith (1969. Estimation in multi-stage surveys. J. Amer. Statist. Assoc. 64, 830-840) or from the finite population mixed model recently proposed by Stanek and Singer (2004. Predicting random effects from finite population clustered samples with response error. J. Amer. Statist. Assoc. 99, 1119-1130). Predictors derived under the latter model, with the additional assumptions that all variance components are known and that within-cluster variances are equal, have smaller mean squared error (MSE) than the competitors based on either the ME or Scott and Smith's models. As population variances are rarely known, we propose method-of-moments estimators to obtain empirical predictors and conduct a simulation study to evaluate their performance. The results suggest that the finite population mixed model empirical predictor is more stable than its competitors since, in terms of MSE, it is either the best or the second best, and when second best, its performance lies within acceptable limits. When both cluster and unit intra-class correlation coefficients are very high (e.g., 0.95 or more), the performance of the empirical predictors derived under the three models is similar. (c) 2007 Elsevier B.V. All rights reserved.
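As a point of reference for the comparison above, the familiar ME-model predictor of a realized cluster mean shrinks the cluster average toward the grand mean, with variance components estimated by the method of moments. The sketch below is a balanced-design toy with invented numbers; the paper's finite population predictor uses a different shrinkage constant, so this shows only the general shape of such empirical predictors.

```python
# Toy ME-model empirical predictor of a realized cluster mean, with
# method-of-moments variance components. All numbers are invented.
import numpy as np

rng = np.random.default_rng(7)
k, n = 20, 5                               # clusters, units per cluster
mu, sb, se = 10.0, 2.0, 1.0                # true mean and std. deviations
b = rng.normal(0, sb, k)                   # realized cluster effects
y = mu + b[:, None] + rng.normal(0, se, (k, n))

ybar = y.mean(axis=1)                      # cluster sample means
grand = y.mean()                           # grand mean
msw = ((y - ybar[:, None]) ** 2).sum() / (k * (n - 1))   # within-cluster MS
msb = n * ((ybar - grand) ** 2).sum() / (k - 1)          # between-cluster MS
sb2_hat = max((msb - msw) / n, 0.0)        # method-of-moments components
se2_hat = msw
shrink = sb2_hat / (sb2_hat + se2_hat / n) # shrinkage weight
pred = grand + shrink * (ybar[0] - grand)  # empirical predictor of mu + b[0]
print(round(float(pred), 2))
```

The predictor always lies between the grand mean and the cluster's own average; the shrinkage weight grows toward 1 as the between-cluster variance dominates the within-cluster noise.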

Relevance: 30.00%

Abstract:

Prediction of random effects is an important problem with expanding applications. In the simplest context, the problem corresponds to prediction of the latent value (the mean) of a realized cluster selected via two-stage sampling. Recently, Stanek and Singer [Predicting random effects from finite population clustered samples with response error. J. Amer. Statist. Assoc. 99, 1119-1130] developed best linear unbiased predictors (BLUPs) under a finite population mixed model that outperform BLUPs from mixed models and superpopulation models. Their setup, however, does not allow for unequally sized clusters. To overcome this drawback, we consider an expanded finite population mixed model based on a larger set of random variables that span a higher-dimensional space than those typically applied to such problems. We show that BLUPs for linear combinations of the realized cluster means derived under such a model have considerably smaller mean squared error (MSE) than those obtained from mixed models, superpopulation models, and finite population mixed models. We motivate our general approach with an example developed for two-stage cluster sampling and show that it faithfully captures the stochastic aspects of sampling in the problem. We also present simulation studies to illustrate the increased accuracy of the BLUP obtained under the expanded finite population mixed model. (C) 2007 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Schistosomiasis affects more than 200 million people worldwide; another 600 million are at risk of infection. The schistosomulum stage is believed to be the target of protective immunity in the attenuated cercaria vaccine model. In an attempt to identify genes up-regulated in the schistosomulum stage relative to cercaria, we explored the Schistosoma mansoni transcriptome by looking at the relative frequency of reads in EST libraries from both stages. The 400 genes potentially up-regulated in schistosomula were analyzed as to their Gene Ontology categorization, and we focused on those encoding predicted proteins with no similarity to proteins of other organisms, assuming they could be parasite-specific proteins important for survival in the host. Up-regulation in schistosomula relative to cercaria was validated by real-time reverse transcription polymerase chain reaction (RT-PCR) for five out of nine selected genes (56%). We tested their protective potential in mice through immunization with DNA vaccines followed by a parasite challenge. Worm burden reductions of 16-17% were observed for one of them, indicating its protective potential. Our results demonstrate the value and caveats of using stage-associated frequency of ESTs as an indication of differential expression, coupled to DNA vaccine screening, in the identification of novel proteins to be further investigated as potential vaccine candidates.