121 results for linear production set

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 80.00%

Abstract:

beta-Galactosidase (beta-Gal) activity is a widely accepted biomarker to detect senescence both in situ and in vitro. A cytochemical assay based on production of a blue-dyed precipitate that results from the cleavage of the chromogenic substrate X-Gal is commonly used. Blue and nonblue cells are counted under the microscope and a semiquantitative percentage of senescent cells can be obtained. Here, we present a quantitative, fast, and easy-to-use chemiluminescent assay to detect senescence. The Galacton chemiluminescent method used to detect the prokaryotic beta-Gal reporter enzyme in transfection studies was adapted to assay mammalian beta-Gal. The assay showed linear production of luminescence in a time- and cell-number-dependent manner. The chemiluminescent assay showed significant correlation with the cytochemical assay in detecting replicative senescence (Pearson r = 0.8486, p < 0.005). Moreover, the chemiluminescent method (Galacton) also detected stress-induced senescence in cells treated with H2O2, similarly to the cytochemical assay (X-Gal) (Galacton: control 25,207.3 +/- 6,548.6, H2O2 52,487.4 +/- 16,284.9, p < 0.05; X-Gal: control 41.31 +/- 7.0%, H2O2 92.97 +/- 2.8%, p < 0.01). Thus, our method is well suited to the detection of replicative and stress-induced senescence in cell culture. (C) 2007 Elsevier Inc. All rights reserved.
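
The Pearson correlation reported above (r = 0.8486) quantifies the linear agreement between the chemiluminescent and cytochemical readouts. A minimal pure-Python sketch of the computation, using hypothetical paired readings rather than the study's data:

```python
from math import sqrt

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired readings: luminescence (RLU) vs. % X-Gal-positive cells
rlu = [12000, 18000, 26000, 41000, 52000]
pct_blue = [10.0, 22.0, 35.0, 61.0, 88.0]
r = pearson_r(rlu, pct_blue)
```

A value of r near 1 means the two assays rank the samples the same way, which is the kind of agreement the study reports.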

Relevance: 40.00%

Abstract:

We explored possible effects of negative covariation among finger forces in multifinger accurate force production tasks on the classical Fitts's speed-accuracy trade-off. Healthy subjects performed cyclic force changes between pairs of targets "as quickly and accurately as possible." Tasks with two force amplitudes and six ratios of force amplitude to target size were performed by each of the four fingers of the right hand and four finger combinations. There was a close to linear relation between movement time and the log-transformed ratio of target amplitude to target size across all finger combinations. There was a close to linear relation between standard deviation of force amplitude and movement time. There were no differences between the performance of either of the two "radial" fingers (index and middle) and the multifinger tasks. The "ulnar" fingers (little and ring) showed higher indices of variability and longer movement times as compared with both "radial" fingers and multifinger combinations. We conclude that potential effects of the negative covariation and also of task-sharing across a set of fingers are counterbalanced by an increase in individual finger force variability in multifinger tasks as compared with single-finger tasks. The results speak in favor of a feed-forward model of multifinger synergies. They corroborate a hypothesis that multifinger synergies are created not to improve overall accuracy, but to allow the system larger flexibility, for example to deal with unexpected perturbations and concomitant tasks.
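
The speed-accuracy trade-off above can be made concrete with the classic Fitts formulation, in which movement time grows linearly with the index of difficulty log2(2A/W). The intercept and slope below are hypothetical fit constants, not values from the study:

```python
from math import log2

def index_of_difficulty(amplitude, width):
    """Classic Fitts index of difficulty, ID = log2(2A / W), in bits."""
    return log2(2 * amplitude / width)

def predicted_movement_time(amplitude, width, a=0.1, b=0.15):
    """Linear Fitts model MT = a + b * ID (a, b are hypothetical constants)."""
    return a + b * index_of_difficulty(amplitude, width)
```

Shrinking the target (smaller width at fixed amplitude) raises the index of difficulty and hence the predicted movement time.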

Relevance: 40.00%

Abstract:

Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups that are typically present in lot sizing problems are relaxed together with integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present some sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains when compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
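
The cutting stock half of the combined model works over cutting patterns; a toy sketch of pattern enumeration and trim loss follows (the stock and piece lengths are made-up numbers, and a real column generation scheme would price patterns against the LP dual values rather than enumerate them all):

```python
from itertools import product

def feasible_patterns(stock_len, piece_lens, max_each=5):
    """Enumerate cutting patterns (piece counts) that fit one stock object,
    returning (pattern, trim_loss) pairs."""
    patterns = []
    for counts in product(range(max_each + 1), repeat=len(piece_lens)):
        used = sum(c * l for c, l in zip(counts, piece_lens))
        if 0 < used <= stock_len:
            patterns.append((counts, stock_len - used))
    return patterns

# Hypothetical data: 100 cm stock, part lengths of 45 cm and 30 cm
pats = feasible_patterns(100, [45, 30])
best = min(pats, key=lambda p: p[1])  # the pattern with the least trim loss
```

The trade-off the paper describes arises because the low-trim-loss patterns may overproduce parts that then incur storage costs.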

Relevance: 30.00%

Abstract:

The effects of combined dietary levels of digestible lysine and chelated zinc on the egg quality of laying hens were evaluated. A total of 720 birds, from 48 to 60 weeks of age, were distributed in a completely randomized design in a 3 × 5 factorial scheme with three levels of zinc and five levels of digestible lysine, with six replicates of eight birds per experimental unit. The levels were 137, 309 and 655 ppm zinc and 0.482, 0.527, 0.582, 0.644 and 0.732% digestible lysine. No interaction between digestible lysine and zinc was observed for the primary variables of egg fractions and composition. Increasing levels of zinc reduced egg weight, suggesting lower efficiency of nutrient intake. At the highest dietary concentration of zinc, the addition of digestible lysine coincided with a linear increase in shell weight. However, zinc addition, regardless of the lysine level in the diet, reduced egg weight and the percentage of mineral matter in the yolk, limiting the efficiency of mineral deposition in this fraction of the egg. The concentration of zinc that produced the best results was 137 ppm, inasmuch as higher amounts limited the use of digestible lysine, with effects harming egg composition and quality. The study indicates the following digestible lysine requirements: 0.639% from the 48th to the 52nd week, 0.679% from the 52nd to the 56th week, and 0.635% from the 56th to the 60th week. Considering the total period from the 48th to the 60th week, a level of 0.638% lysine, or a daily intake of 707 mg of the amino acid, met the egg quality requirement of the semi-heavy layers used in this study.

Relevance: 30.00%

Abstract:

The HACCP system is being increasingly used to ensure food safety. This study investigated the validation of control measures in order to establish performance indicators of the HACCP system in the manufacturing process of Lasagna Bolognese (meat lasagna). Samples were collected along the manufacturing process as a whole, before and after the CCPs. The following microorganism indicators (MIs) were assessed: total mesophile and faecal coliform counts. The same MIs were analyzed in the final product, as well as the microbiological standards required by the current legislation. A significant reduction in the total mesophile count was observed after cooking (p < 0.001). After storage, there was a numerical, though non-significant, change in the MI count. Faecal coliform counts were also significantly reduced (p < 0.001) after cooking. We were able to demonstrate that the HACCP system allowed us to meet the standards set by both the company and the Brazilian regulations, as proved by the reduction in the established indicators.
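
The significant drops in mesophile and coliform counts at the cooking CCP are conventionally expressed as decimal (log10) reductions; a minimal sketch with hypothetical counts:

```python
from math import log10

def log_reduction(before_cfu, after_cfu):
    """Decimal (log10) reduction of a microbial count across a process step."""
    return log10(before_cfu) - log10(after_cfu)

# Hypothetical counts (CFU/g) before and after the cooking CCP
reduction = log_reduction(1e6, 1e2)  # a 4-log (99.99%) reduction
```

Expressing CCP performance this way makes it easy to compare against the reduction targets set in a validation study.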

Relevance: 30.00%

Abstract:

As a contribution to the Large-Scale Biosphere-Atmosphere Experiment in Amazonia - Cooperative LBA Airborne Regional Experiment (LBA-CLAIRE-2001) field campaign in the heart of the Amazon Basin, we analyzed the temporal and spatial dynamics of the urban plume of Manaus City during the wet-to-dry season transition period in July 2001. During the flights, we performed vertical stacks of crosswind transects in the urban outflow downwind of Manaus City, measuring a comprehensive set of trace constituents including O(3), NO, NO(2), CO, VOC, CO(2), and H(2)O. Aerosol loads were characterized by concentrations of total aerosol number (CN) and cloud condensation nuclei (CCN), and by light scattering properties. Measurements over pristine rainforest areas during the campaign showed low levels of pollution from biomass burning or industrial emissions, representative of wet season background conditions. The urban plume of Manaus City was found to be joined by plumes from power plants south of the city, all showing evidence of very strong photochemical ozone formation. One episode is discussed in detail, where a threefold increase in ozone mixing ratios within the atmospheric boundary layer occurred within a 100 km travel distance downwind of Manaus. Observation-based estimates of the ozone production rates in the plume reached 15 ppb h(-1). Within the plume core, aerosol concentrations were strongly enhanced, with Delta CN/Delta CO ratios about one order of magnitude higher than observed in Amazon biomass burning plumes. Delta CN/Delta CO ratios tended to decrease with increasing transport time, indicative of a significant reduction in particle number by coagulation, and without substantial new particle nucleation occurring within the time/space observed. 
While in the background atmosphere a large fraction of the total particle number served as CCN (about 60-80% at 0.6% supersaturation), the CCN/CN ratios within the plume indicated that only a small fraction (16 +/- 12%) of the plume particles were CCN. The fresh plume aerosols showed relatively weak light scattering efficiency. The CO-normalized CCN concentrations and light scattering coefficients increased with plume age in most cases, suggesting particle growth by condensation of soluble organic or inorganic species. We used a Single Column Chemistry and Transport Model (SCM) to infer the urban pollution emission fluxes of Manaus City, using the observed mixing ratios of CO, NO(x) and VOC. The model can reproduce the temporal/spatial distribution of ozone enhancements in the Manaus plume, both with and without accounting for the distinct (high-NO(x)) contribution of the power plants, thereby examining the sensitivity of ozone production to changes in the emission rates of NO(x). The VOC reactivity in the Manaus region was dominated by a high burden of biogenic isoprene from the background rainforest atmosphere, and therefore NO(x) control is assumed to be the most effective ozone abatement strategy. Both observations and models show that the agglomeration of NO(x) emission sources, like power plants, in a well-arranged area can decrease the ozone production efficiency in the near field of the urban populated cores. On the other hand, remote areas downwind of the city then bear the brunt, being exposed to increased ozone production and N-deposition. The simulated maximum stomatal ozone uptake fluxes were 4 nmol m(-2) s(-1) close to Manaus, and decreased only to about 2 nmol m(-2) s(-1) within a travel distance >1500 km downwind from Manaus, clearly exceeding the critical threshold level for broadleaf trees.
Likewise, the simulated N deposition close to Manaus was ~70 kg N ha(-1) a(-1), decreasing only to about 30 kg N ha(-1) a(-1) after three days of simulation.
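
The Delta CN/Delta CO ratios used above are excess-over-background enhancement ratios, which remove the effect of dilution when comparing plumes. A minimal sketch with hypothetical values:

```python
def enhancement_ratio(cn_plume, cn_bg, co_plume, co_bg):
    """Delta CN / Delta CO: excess particle number per unit excess CO."""
    return (cn_plume - cn_bg) / (co_plume - co_bg)

# Hypothetical values: CN in cm^-3, CO in ppb
ratio = enhancement_ratio(cn_plume=12000, cn_bg=400, co_plume=180, co_bg=90)
```

Because both excesses dilute at the same rate as the plume mixes toward background air, the ratio stays constant under dilution, which is why it is useful for comparing plumes of different ages and sources.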

Relevance: 30.00%

Abstract:

Measurements of the azimuthal anisotropy of high-p(T) neutral pion (pi(0)) production in Au+Au collisions at s(NN)=200 GeV by the PHENIX experiment are presented. The data included in this article were collected during the 2004 Relativistic Heavy Ion Collider running period and represent approximately an order of magnitude increase in the number of analyzed events relative to previously published results. Azimuthal angle distributions of pi(0) mesons detected in the PHENIX electromagnetic calorimeters are measured relative to the reaction plane determined event-by-event using the forward and backward beam-beam counters. Amplitudes of the second Fourier component (v(2)) of the angular distributions are presented as a function of pi(0) transverse momentum (p(T)) for different bins in collision centrality. Measured reaction plane dependent pi(0) yields are used to determine the azimuthal dependence of the pi(0) suppression as a function of p(T), R(AA)(Delta phi,p(T)). A jet-quenching motivated geometric analysis is presented that attempts to simultaneously describe the centrality dependence and reaction plane angle dependence of the pi(0) suppression in terms of the path lengths of hypothetical parent partons in the medium. This set of results allows for a detailed examination of the influence of geometry in the collision region and of the interplay between collective flow and jet-quenching effects along the azimuthal axis.
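
The coefficient v(2) quantifying the azimuthal anisotropy is the event-averaged second Fourier component of the angular distribution relative to the reaction plane. A minimal sketch with synthetic angles (not PHENIX data):

```python
from math import cos, pi
from random import seed, uniform

def v2(phis, psi_rp=0.0):
    """v2 = <cos(2 * (phi - Psi_RP))> over the particle azimuthal angles."""
    return sum(cos(2 * (p - psi_rp)) for p in phis) / len(phis)

# An isotropic sample should give v2 close to zero
seed(0)
iso = [uniform(-pi, pi) for _ in range(20000)]
```

Particles emitted purely in-plane (phi = 0 or pi) give v2 = 1, while an isotropic source gives v2 consistent with zero; measured pi(0) samples fall in between.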

Relevance: 30.00%

Abstract:

This work presents a non-linear boundary element formulation applied to the analysis of contact problems. The boundary element method (BEM) is known as a robust and accurate numerical technique to handle this type of problem, because the contact among the solids occurs along their boundaries. The proposed non-linear formulation is based on the use of singular or hyper-singular integral equations by the BEM, for multi-region contact. When the contact occurs between crack surfaces, the formulation adopted is the dual version of the BEM, in which singular and hyper-singular integral equations are defined along the opposite sides of the contact boundaries. The structural non-linear behaviour on the contact is considered using Coulomb's friction law. The non-linear formulation is based on the tangent operator, in which the derivative of the set of algebraic equations is used to construct the corrections for the non-linear process. This implicit formulation has shown to be as accurate as the classical approach; however, it computes the solution faster. Examples of simple and multi-region contact problems are shown to illustrate the applicability of the proposed scheme. (C) 2011 Elsevier Ltd. All rights reserved.
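
The tangent-operator scheme described above is Newton's method applied to the residual of the algebraic system. A one-dimensional sketch (the cubic residual is purely illustrative, not the BEM contact equations):

```python
def newton_solve(residual, tangent, x0, tol=1e-10, max_iter=50):
    """Newton iteration x <- x - residual(x) / tangent(x), where `tangent`
    is the derivative of the residual (the 1-D analogue of the tangent operator)."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            break
        x -= r / tangent(x)
    return x

# Illustrative scalar nonlinearity: solve x**3 + 2*x - 5 = 0
root = newton_solve(lambda x: x**3 + 2*x - 5, lambda x: 3*x**2 + 2, x0=1.0)
```

The quadratic convergence of the tangent-based correction is consistent with the speed advantage over the classical approach reported above.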

Relevance: 30.00%

Abstract:

We examine the representation of judgements of stochastic independence in probabilistic logics. We focus on a relational logic where (i) judgements of stochastic independence are encoded by directed acyclic graphs, and (ii) probabilistic assessments are flexible in the sense that they are not required to specify a single probability measure. We discuss issues of knowledge representation and inference that arise from our particular combination of graphs, stochastic independence, logical formulas and probabilistic assessments. (C) 2007 Elsevier B.V. All rights reserved.
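
Encoding judgements of stochastic independence in a DAG implies the usual factorization of a joint distribution over each variable's parents. A minimal two-node sketch (the conditional probability tables are made up):

```python
def joint_prob(assignment, cpts, parents):
    """P(x) = product over variables of P(x_i | parents(x_i)),
    the factorization implied by the DAG's independence judgements."""
    p = 1.0
    for var, cpt in cpts.items():
        key = tuple(assignment[q] for q in parents[var])
        p *= cpt[key][assignment[var]]
    return p

# Hypothetical DAG A -> B over binary variables
parents = {"A": (), "B": ("A",)}
cpts = {
    "A": {(): {0: 0.6, 1: 0.4}},
    "B": {(0,): {0: 0.9, 1: 0.1}, (1,): {0: 0.2, 1: 0.8}},
}
p11 = joint_prob({"A": 1, "B": 1}, cpts, parents)  # 0.4 * 0.8
```

The flexible assessments discussed in the paper would replace these point-valued tables with sets of admissible measures; the factorization structure is what the graph contributes.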

Relevance: 30.00%

Abstract:

Here, we study the stable integration of real time optimization (RTO) with model predictive control (MPC) in a three-layer structure. The intermediate layer is a quadratic programming problem whose objective is to compute reachable targets for the MPC layer that lie at the minimum distance from the optimum set points produced by the RTO layer. The lower layer is an infinite horizon MPC with guaranteed stability, with additional constraints that force the feasibility and convergence of the target calculation layer. We also consider the case in which there is polytopic uncertainty in the steady state model used in the target calculation. The dynamic part of the MPC model is also considered unknown, but is assumed to be represented by one model from a discrete set of models. The efficiency of the methods presented here is illustrated with the simulation of a low order system. (C) 2010 Elsevier Ltd. All rights reserved.
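
In one dimension, the intermediate QP layer reduces to projecting the RTO set point onto the interval of targets the MPC can reach. A deliberately simplified sketch of that target calculation:

```python
def target_calc(rto_setpoint, lo, hi):
    """Project the RTO set point onto the reachable interval [lo, hi]
    (the 1-D analogue of the quadratic programming target calculation)."""
    return min(max(rto_setpoint, lo), hi)
```

When the optimum set point is unreachable, the MPC is handed the closest reachable target instead, which is what keeps the RTO and MPC layers consistent.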

Relevance: 30.00%

Abstract:

A new concept and a preliminary study for a monocolumn floating unit are introduced, aimed at exploring and producing oil in ultradeep waters. This platform, which combines two relevant features (great oil storage capacity and dry tree production capability), comprises two bodies with relatively independent heave motions between them. A parametric model is used to define the main design characteristics of the floating units. A set of design alternatives is generated using this procedure. These solutions are evaluated in terms of stability requirements and dynamic response. A mathematical model is developed to estimate the first order heave and pitch motions of the platform. Experimental tests are carried out in order to calibrate this model. The response of each body alone is estimated numerically using the WAMIT (R) code. This paper also includes a preliminary study on the platform mooring system and appendages. The study of the heave plates presents the gain, in terms of decreasing the motions, achieved by the introduction of the appropriate appendages to the platform. [DOI: 10.1115/1.4001429]

Relevance: 30.00%

Abstract:

Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained using factors and/or covariates. When such factors operate, the usual normal regression models, which inherently exhibit constant variance, will under-represent variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The posterior joint density function was sampled using Markov chain Monte Carlo (MCMC) algorithms, allowing inferences over the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when limited prior information is available, thereby generating valuable insight for the researcher about the experimental results.
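
Overdispersion in count and proportion data means the variance exceeds the nominal binomial value n*p*(1-p). The beta-binomial inflation factor below illustrates the effect (rho is a generic intra-cluster correlation parameter, not one estimated in the paper):

```python
def binomial_var(n, p):
    """Variance of a binomial count, assuming no overdispersion."""
    return n * p * (1 - p)

def beta_binomial_var(n, p, rho):
    """Variance of a beta-binomial count; rho in [0, 1) is an
    overdispersion (intra-cluster correlation) parameter."""
    return n * p * (1 - p) * (1 + (n - 1) * rho)
```

Even a modest rho inflates the variance severalfold for moderate n, which is why a plain binomial model understates uncertainty and flags spurious effects.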

Relevance: 30.00%

Abstract:

The representation of sustainability concerns in industrial forest management plans, in relation to environmental, social and economic aspects, involves a great amount of detail in analyzing and understanding the interaction among these aspects to reduce possible future impacts. At the tactical and operational planning levels, methods based on generic assumptions usually provide non-realistic solutions, impairing the decision-making process. This study aims at improving current operational harvest planning techniques through the development of a mixed integer goal programming model. This allows the evaluation of different scenarios, subject to environmental and supply constraints, increases of operational capacity, and the spatial consequences of dispatching harvest crews to certain distances over the evaluation period. As a result, a set of performance indicators was selected to evaluate the optimal solutions provided for the different possible scenarios and combinations of these scenarios, and to compare these outcomes with the real results observed by the mill in the case study area. Results showed that it is possible to elaborate a linear programming model that adequately represents harvesting limitations, production aspects, and environmental and supply constraints. The comparison involving the evaluated scenarios and the real observed results showed the advantage of using more holistic approaches, and that it is possible to improve the quality of planning recommendations using linear programming techniques.
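
Goal programming of the kind used in the model replaces each hard target with nonnegative positive/negative deviation variables and minimizes their weighted sum. A minimal sketch with made-up production volumes and goals:

```python
def goal_deviations(value, goal):
    """Deviation variables of goal programming: value - goal = d_plus - d_minus,
    with d_plus, d_minus >= 0. Returns (d_plus, d_minus)."""
    diff = value - goal
    return (max(diff, 0.0), max(-diff, 0.0))

def weighted_goal_cost(values, goals, weights):
    """Weighted sum of deviations from each goal, the objective to minimize."""
    return sum(w * sum(goal_deviations(v, g))
               for v, g, w in zip(values, goals, weights))
```

Unlike hard constraints, this formulation stays feasible when goals conflict, trading off over- and under-achievement according to the weights, which suits competing supply, environmental and capacity targets.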

Relevance: 30.00%

Abstract:

The scope of this research was to investigate biogas production and purification by a two-step bench-scale biological system, consisting of fed-batch pulse-feeding anaerobic digestion of mixed sludge, followed by methane enrichment of the biogas using the cyanobacterium Arthrospira platensis. The composition of the biogas was nearly constant, with methane and carbon dioxide percentages ranging between 70.5-76.0% and 13.2-19.5%, respectively. Biogas yield reached a maximum value (about 0.4 m(biogas)(3)/kgCOD(i)) at a 50-day retention time and then gradually decreased with decreasing retention time. Biogas CO(2) was then used as a carbon source for A. platensis cultivation under either batch or fed-batch conditions. The mean cell productivity of fed-batch cultivation was about 15% higher than that observed during the last batch phase (0.035 +/- 0.006 g(DM)/L/d), likely due to the occurrence of some shading effect under batch growth conditions. The data on carbon dioxide removal from biogas revealed a linear relationship between the rates of A. platensis growth and carbon dioxide removal, and allowed calculation of a carbon utilization efficiency for biomass production of almost 95%. (C) 2009 Elsevier Ltd. All rights reserved.
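
The ~95% carbon utilization efficiency is the fraction of the carbon removed from the biogas (as CO2) that is recovered in biomass. A back-of-the-envelope sketch, assuming a literature-typical 46% carbon content for dry A. platensis biomass (an assumption for illustration, not a value from the paper):

```python
def carbon_utilization_efficiency(co2_removed_g, biomass_g, carbon_frac=0.46):
    """Fraction of carbon removed from biogas (as CO2) recovered in biomass.
    12/44 converts g CO2 to g C; carbon_frac is an assumed biomass C content."""
    carbon_removed_g = co2_removed_g * 12.0 / 44.0
    return biomass_g * carbon_frac / carbon_removed_g

# Hypothetical masses: 100 g CO2 removed from biogas, 56 g dry biomass produced
eff = carbon_utilization_efficiency(100.0, 56.0)
```

An efficiency near 1 indicates that nearly all the CO2 scrubbed from the biogas is fixed into biomass rather than lost.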

Relevance: 30.00%

Abstract:

A study was conducted to verify whether the theory on the evolution of corporate environmental management (CEM) is applicable to organizations located in Brazil. Some of the most important proposals pertaining to the evolution of CEM were evaluated in a systematic fashion and integrated into a typical theoretical framework containing three evolutionary stages: reactive, preventive and proactive. The validity of this framework was tested by surveying 94 companies located in Brazil with ISO 14001 certification. Results indicated that the evolution of CEM tends to occur in a manner that is counter to what has generally been described in the literature. Two evolutionary stages were identified: 1) synergy for eco-efficiency and 2) environmental legislation view, which combine variables that were initially categorized into different theoretical CEM stages. These data, obtained from a direct study of Brazilian companies, suggest that the evolution of environmental management in organizations tends to occur in a non-linear fashion, requiring a re-analysis of traditional perceptions of CEM development, as suggested by Kolk and Mauser (2002). (C) 2010 Elsevier Ltd. All rights reserved.