903 results for Time Based Media
Abstract:
In metal-contaminated environments, living organisms are exposed to several metals simultaneously. The usual models for predicting the biological effects of metals on organisms (e.g., the biotic ligand model, BLM; the free-ion activity model, FIAM) are chemical equilibrium models that predict, in the presence of a second metal, a decrease in the bioaccumulation of the metal of interest and, consequently, an attenuation of its effects. Toxicity biomarkers, such as phytochelatins (PCs), have been used as an alternative means of assessing biological effects. Phytochelatins are cysteine-rich polypeptides with the general structure (γ-Glu-Cys)n-Gly, where n ranges from 2 to 11. Their synthesis appears to depend on the concentration of metal ions as well as on the duration of the organism's exposure to the metals. The objective of this study was therefore to determine whether, in binary metal mixtures, phytochelatin synthesis can be predicted by chemical equilibrium models such as the BLM. To this end, the quantity of phytochelatins produced in response to exposure to the binary mixtures Cd-Ca, Cd-Cu and Cd-Pb was measured while the direct effect of competition was monitored through the concentrations of internalized metals. After six hours of exposure, Cd bioaccumulation decreased in the presence of Ca and of very high concentrations of Pb and Cu (on the order of 5×10⁻⁶ M). In contrast, at moderate concentrations of these two metals, Cd increased in the presence of Cu and did not appear to be affected by the presence of Pb. In the case of Cd-Cu competition, a good correlation was observed between the production of PC2, PC3 and PC4 and the quantity of bioaccumulated metals; for both phytochelatin synthesis and bioaccumulation, the effects were considered synergistic. In the case of Cd-Ca, the quantities of PC3 and PC4 decreased with internalized metal (an antagonistic effect), but what was remarkable was the large quantity of cysteine (GSH) and PC2 produced at high Ca concentrations. Pb alone did not induce PCs; consequently, there was no variation in the quantity of PCs with the Pb concentration to which the algae were exposed. The detection and quantification of PCs were carried out by high-performance liquid chromatography coupled with a fluorescence detector (HPLC-FL), while intracellular metal concentrations were analyzed by atomic absorption spectroscopy (AAS) or inductively coupled plasma mass spectrometry (ICP-MS).
Abstract:
The results were obtained with the "Insight-2" software from Accelris (San Diego, CA).
Abstract:
Most network operators have considered reducing Label Switched Router (LSR) label spaces (i.e., the number of labels that can be used) as a means of simplifying management of the underlying Virtual Private Networks (VPNs) and, hence, reducing operational expenditure (OPEX). This letter discusses the problem of reducing the label space in Multiprotocol Label Switching (MPLS) networks using label merging, better known as MultiPoint-to-Point (MP2P) connections. Because of their origins in IP, MP2P connections have been considered to have tree shapes with Label Switched Paths (LSPs) as branches. For this reason, previous works by many authors affirm that the problem of minimizing the label space using MP2P in MPLS, the Merging Problem, cannot be solved optimally with a polynomial algorithm (it is NP-complete), since it involves a hard decision problem. In this letter, however, the Merging Problem is analyzed from the perspective of MPLS, and it is deduced that tree shapes in MP2P connections are irrelevant. By overriding this tree-shape consideration, it is possible to perform label merging in polynomial time. Based on how MPLS signaling works, this letter proposes an algorithm to compute the minimum number of labels using label merging: the Full Label Merging algorithm. In conclusion, we reclassify the Merging Problem as polynomial-solvable instead of NP-complete. In addition, simulation experiments confirm that without the tree-branch selection problem, the label space can be reduced further.
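To illustrate the intuition with a minimal sketch (an illustration only, not the paper's Full Label Merging algorithm, whose internals the abstract does not give): once tree shapes are ignored, every LSP towards the same egress can share one label at each hop, so a node's label space reduces to the number of distinct egresses among the paths traversing it. In the hypothetical Python helper below, paths are given as node sequences ending at their egress.

    from collections import defaultdict

    def label_space(paths):
        # With full MP2P merging, all paths to the same egress share one
        # label per hop, so each transit node needs one label per egress.
        fecs = defaultdict(set)  # node -> egresses it forwards towards
        for path in paths:
            egress = path[-1]
            for node in path[:-1]:  # the egress itself pops the label
                fecs[node].add(egress)
        return {node: len(e) for node, e in fecs.items()}

    # Three LSPs, two egresses: node 'b' merges a->c and d->c onto one label.
    print(label_space([("a", "b", "c"), ("d", "b", "c"), ("a", "b", "e")]))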
Abstract:
International development cooperation has been characterized by constant evolution over the last three decades. The foundations on which such cooperation is practiced have been reformulated, affecting the way the various agents involved interact. The first part of this work seeks to characterize the nature of the interaction between agents within the cooperation process; to do so, we turn to Game Theory, in particular to cooperative games in their agreement form, introducing the concept of Pareto optimality and the Coase efficiency postulate. The second part of this work is devoted to the concept of development. We describe its evolution, characterized by the breaking of paradigms, and present two approaches: one based on the how and for whom, and a temporal one referring to the short and long term, stressing that the current approach is the one centered on human elements. We also analyze the role of Official Development Assistance (ODA) from a political point of view, which allows us to glimpse its implicit interests in recipient States. Finally, we describe the critical elements of the evolution of relations and development cooperation between Latin America and the European Union, as well as Colombia's relationship with the latter. Additionally, we detail the important role that Non-Governmental Organizations (NGOs) have played in the development of projects generated within the framework of the cooperation relations between Latin America and the European Union.
Abstract:
We develop a new sparse kernel density estimator using a forward constrained regression framework, within which the nonnegative and summing-to-unity constraints of the mixing weights can easily be satisfied. Our main contribution is to derive a recursive algorithm to select significant kernels one at a time based on the minimum integrated square error (MISE) criterion for both the selection of kernels and the estimation of mixing weights. The proposed approach is simple to implement and the associated computational cost is very low. Specifically, the complexity of our algorithm is of the order of the number of training data, N, which is much lower than the order N² complexity offered by the best existing sparse kernel density estimators. Numerical examples are employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy comparable to that of the classical Parzen window estimate and other existing sparse kernel density estimators.
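As a rough illustration of the greedy selection idea (a sketch under simplifying assumptions: uniform mixing weights instead of estimated ones, and a leave-one-out least-squares surrogate of the integrated square error rather than the paper's MISE recursion, so it does not attain the stated O(N) complexity), kernels can be added one at a time while the criterion keeps improving:

    import numpy as np

    def lscv_score(data, centers, h):
        # ISE surrogate up to a constant: int fhat^2 - (2/n) sum_i fhat_{-i}(x_i),
        # with uniform weights 1/m on the m selected Gaussian kernels.
        data, centers = np.asarray(data), np.asarray(centers)
        m, n = len(centers), len(data)
        w = 1.0 / m
        # closed form of int fhat^2: each Gaussian kernel pair integrates to
        # exp(-(ci - cj)^2 / (4 h^2)) / (2 h sqrt(pi))
        cc = np.subtract.outer(centers, centers)
        int_f2 = w * w * np.sum(np.exp(-0.25 * (cc / h) ** 2)) / (2 * h * np.sqrt(np.pi))
        fit = 0.0
        for xi in data:
            k = np.exp(-0.5 * ((xi - centers) / h) ** 2) / (h * np.sqrt(2 * np.pi))
            mask = centers != xi  # crude leave-one-out: drop xi's own kernel
            if mask.any():
                fit += w * k[mask].sum()
        return int_f2 - 2.0 * fit / n

    def sparse_kde(data, h, max_kernels=15):
        centers, best, pool = [], np.inf, list(data)
        while pool and len(centers) < max_kernels:
            score, c = min((lscv_score(data, centers + [x], h), x) for x in pool)
            if score >= best:
                break  # the criterion stopped improving
            best = score
            centers.append(c)
            pool.remove(c)
        return np.asarray(centers)

    rng = np.random.default_rng(1)
    sample = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(2, 0.5, 100)])
    print(sparse_kde(sample, h=0.4))  # a handful of centers near the two modes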
Abstract:
Patterns of the biosynthesis of major metabolites of the oleaginous yeast Cryptococcus curvatus NRRL Y-1511 were investigated during cultivation on sugar-based media. When lactose or sucrose was employed as substrate under nitrogen-limited conditions, the yeast strain accumulated high quantities of intra-cellular total sugars (ITS) at the beginning of fermentation (up to 68% w/w), with ITS values progressively decreasing to 20% w/w at the end of the fermentation. The decrease in ITS content and consumption of extracellular lactose led to a subsequent rise in lipid accumulation, reaching 29.8% of dry cell weight at 80 g/L of initial lactose concentration. Lactose was a more favorable substrate for lipid production than sucrose. In nitrogen-excess conditions, ITS were produced in significant quantities despite the continuous presence of nitrogen in the medium. Growth on lactose was not accompanied by secretion of extra-cellular β-galactosidase. High quantities of extra-cellular invertase were observed during growth on sucrose. The composition of ITS was highly influenced by the sugar used as substrate. Cellular lipids contained mainly palmitic acid and, to a lesser extent, linoleic and stearic acids. This is the first report in the literature that demonstrates the interplay between the biosynthesis of intra-cellular total sugars and lipid synthesis for oleaginous yeast strains.
Abstract:
Experimental results from the open literature have been employed for the design and techno-economic evaluation of four process flowsheets for the production of microbial oil or biodiesel. The fermentation of glucose-based media using the yeast strain Rhodosporidium toruloides has been considered. Biodiesel production was based on the exploitation of either direct transesterification (without extraction of lipids from microbial biomass) or indirect transesterification of extracted microbial oil. When glucose-based renewable resources are used as carbon source for an annual production capacity of 10,000 t microbial oil and zero cost of glucose (assuming development of integrated biorefineries in existing industries utilising waste or by-product streams), the estimated unitary cost of purified microbial oil is $3.4/kg. Biodiesel production via indirect transesterification of extracted microbial oil proved to be a more cost-competitive process than the direct conversion of dried yeast cells. For a glucose price of $400/t, the oil production cost and biodiesel production cost are estimated to be $5.5/kg oil and $5.9/kg biodiesel, respectively. Industrial implementation of microbial oil production from oleaginous yeast is strongly dependent on the feedstock used and on the fermentation stage, where significantly higher productivities and final microbial oil concentrations should be achieved.
Abstract:
Although they seek to understand processes in the same spatial domain, the catchment hydrology and water quality scientific communities are relatively disconnected, and so are their respective models. This is emphasized by an inadequate representation of transport processes in both catchment-scale hydrological and water quality models. While many hydrological models at the catchment scale only account for pressure propagation and not for mass transfer, catchment-scale water quality models are typically limited by overly simplistic representations of flow processes. With the objective of raising awareness of this issue and outlining potential ways forward, we provide a non-technical overview of (1) the importance of hydrology-controlled transport through catchment systems as the link between hydrology and water quality; (2) the limitations of the current generation of catchment-scale hydrological and water quality models; (3) the concept of transit times as a tool to quantify transport; and (4) the benefits of transit-time-based formulations of solute transport for catchment-scale hydrological and water quality models. There is emerging evidence that an explicit formulation of transport processes based on the concept of transit times has the potential to improve the understanding of the integrated system dynamics of catchments and to provide a stronger link between catchment-scale hydrological and water quality models.
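As a pointer to what such a formulation looks like in its simplest, lumped form (generic textbook notation, not equations taken from this paper), solute output is the input signal convolved with the catchment's transit time distribution h(τ):

    C_{\mathrm{out}}(t) = \int_{0}^{\infty} C_{\mathrm{in}}(t - \tau)\, h(\tau)\, \mathrm{d}\tau

so that transport, and not only pressure propagation, is carried explicitly by the travel-time density h.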
Abstract:
Kumaraswamy [Generalized probability density-function for double-bounded random-processes, J. Hydrol. 46 (1980), pp. 79-88] introduced a distribution for double-bounded random processes with hydrological applications. For the first time, based on this distribution, we describe a new family of generalized distributions (denoted with the prefix 'Kw') to extend the normal, Weibull, gamma, Gumbel and inverse Gaussian distributions, among several other well-known distributions. Some special distributions in the new family, such as the Kw-normal, Kw-Weibull, Kw-gamma, Kw-Gumbel and Kw-inverse Gaussian distributions, are discussed. We express the ordinary moments of any Kw generalized distribution as linear functions of probability weighted moments (PWMs) of the parent distribution. We also obtain the ordinary moments of order statistics as functions of PWMs of the baseline distribution. We use the method of maximum likelihood to fit the distributions in the new class and illustrate the potential of the new model with an application to real data.
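For reference, the Kw-G construction consistent with this description takes a baseline cdf G with density g and sets

    F(x) = 1 - \left[ 1 - G(x)^{a} \right]^{b}, \qquad
    f(x) = a\, b\, g(x)\, G(x)^{a-1} \left[ 1 - G(x)^{a} \right]^{b-1}, \qquad a, b > 0,

which recovers the Kumaraswamy distribution itself when G is the uniform cdf on (0,1).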
Abstract:
The modeling and analysis of lifetime data is an important aspect of statistical work in a wide variety of scientific and technological fields. Good (1953) introduced a probability distribution which is commonly used in the analysis of lifetime data. For the first time, based on this distribution, we propose the so-called exponentiated generalized inverse Gaussian distribution, which extends the exponentiated standard gamma distribution (Nadarajah and Kotz, 2006). Various structural properties of the new distribution are derived, including expansions for its moments, moment generating function, moments of the order statistics, and so forth. We discuss maximum likelihood estimation of the model parameters. The usefulness of the new model is illustrated by means of a real data set. (c) 2010 Elsevier B.V. All rights reserved.
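The exponentiated construction is, in its usual form (stated here as an assumption, since the abstract does not spell it out), a power of the baseline cdf: with G the generalized inverse Gaussian cdf of Good (1953),

    F(x) = G(x)^{\beta}, \qquad f(x) = \beta\, g(x)\, G(x)^{\beta - 1}, \qquad \beta > 0.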
Abstract:
Birnbaum and Saunders (1969a) introduced a probability distribution which is commonly used in reliability studies. For the first time, based on this distribution, the so-called beta-Birnbaum-Saunders distribution is proposed for fatigue life modeling. Various properties of the new model are derived, including expansions for the moments, the moment generating function, mean deviations, and the density function of the order statistics and their moments. We discuss maximum likelihood estimation of the model's parameters. The superiority of the new model is illustrated by means of three real failure data sets. (c) 2010 Elsevier B.V. All rights reserved.
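The beta-G construction behind this family (standard in this line of work; here G is the Birnbaum-Saunders cdf, and the beta Laplace distribution below uses the same device with the Laplace cdf) defines the new cdf through the incomplete beta function ratio:

    F(x) = I_{G(x)}(a, b) = \frac{1}{B(a, b)} \int_{0}^{G(x)} t^{a-1} (1 - t)^{b-1}\, \mathrm{d}t, \qquad a, b > 0.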
Abstract:
The Laplace distribution is one of the earliest distributions in probability theory. For the first time, based on this distribution, we propose the so-called beta Laplace distribution, which extends the Laplace distribution. Various structural properties of the new distribution are derived, including expansions for its moments, moment generating function, moments of the order statistics, and so forth. We discuss maximum likelihood estimation of the model parameters and derive the observed information matrix. The usefulness of the new model is illustrated by means of a real data set. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Until the early 1990s, the simulation of fluid flow in oil reservoirs basically used the numerical technique of finite differences. Since then, streamline-based simulation technology has developed considerably, so that it is nowadays used in several cases and can represent the physical mechanisms that influence fluid flow, such as compressibility, capillarity and gravitational segregation. Streamline-based flow simulation is a tool that can greatly help in waterflood project management, because it provides important information not available through traditional finite-difference simulation and shows, in a direct way, the influence between injector and producer wells. This work presents the application of a methodology published in the literature for optimizing water injection projects to the model of a Brazilian Potiguar Basin reservoir that has a large number of wells. The methodology considers changes of injection well rates over time, based on information made available through streamline simulation: it reduces injection rates in wells of lower efficiency and increases injection rates in more efficient wells. In the proposed model, the methodology was effective. The optimized alternatives presented higher oil recovery associated with a lower injected water volume, which shows better efficiency and, consequently, a reduction in costs. Considering the wide use of water injection in oil fields, the positive outcome of the modeling is important, because it shows a case study in which oil recovery was increased simply through a better distribution of water injection rates.
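A minimal sketch of the reallocation step described above (illustrative only: the function, the damping factor and the efficiency measure are assumptions, not the published methodology's exact formulas). Injection is shifted away from injectors whose streamline-derived efficiency falls below the rate-weighted field average and towards those above it, keeping the total injected volume constant:

    import numpy as np

    def reallocate(rates, efficiencies, damping=0.5):
        # rates: current injection rate per injector (e.g. m3/d);
        # efficiencies: offset oil produced per unit of water injected,
        # as allocated to each injector by streamline tracing.
        rates = np.asarray(rates, dtype=float)
        eff = np.asarray(efficiencies, dtype=float)
        mean_eff = np.average(eff, weights=rates)   # rate-weighted average
        new = rates * (1.0 + damping * (eff - mean_eff) / mean_eff)
        new = np.clip(new, 0.0, None)               # no negative injection
        return new * rates.sum() / new.sum()        # preserve the field total

    # The efficient injector gains rate at the expense of the inefficient one.
    print(reallocate([500.0, 500.0, 500.0], [0.8, 0.5, 0.2]))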
Abstract:
Biomass and carotenoid production by Rhodotorula rubra was studied in media based on sugarcane juice, molasses and syrup. The effect of supplementing the media with nitrogen in the form of urea or of the commercial nutrient Nitrofos KL was evaluated. A completely randomized experimental design was used, in a 3 × 3 factorial scheme, with one factor being the substrate (juice, molasses and syrup) and the other the supplementation (control, urea and Nitrofos KL). The results were subjected to analysis of variance and Tukey's test at 5% probability. The highest yeast dry-mass yields were obtained in the molasses-based medium supplemented with urea or Nitrofos KL (15.09 and 14.87 g/L, respectively). Intracellular carotenoid production was high in all the media studied without supplementation (0.329 mg/g). For volumetric production, the best medium was molasses (2.74 mg/L), while supplementation with urea and with Nitrofos KL yielded 2.55 and 2.32 mg/L, respectively. The main carotenoids produced were torulene, torularhodin and β-carotene. The lowest carbohydrate consumption occurred in the unsupplemented sugarcane-juice medium, while the urea-supplemented medium showed the highest consumption.
Abstract:
The present work deals with bilinear predictive control applied to an induction motor. As a particular case of predictive control techniques for nonlinear systems, bilinear models have aroused great interest, since they have the advantage of being simpler than general nonlinear models and more representative than linear ones. The method adopted here uses a model that is quasi-linear at each time step, based on Generalized Predictive Control (GPC). The induction motor is modeled by vector control with indirect rotor field orientation. The system consists of a 3 cv squirrel-cage induction motor, driven on a test bench developed for this work; results are presented for a +5% variation in the set-point value and for variations of +10% and -10% in the nominal load applied to the motor. The results demonstrate the good performance of the bilinear predictive controllers when compared with the linear cases.
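For context, the criterion minimized at each step by Generalized Predictive Control is the standard one (generic horizons and weighting shown here, not the tuned values used on this test bench):

    J = \sum_{j=N_1}^{N_2} \left[ \hat{y}(t+j \mid t) - w(t+j) \right]^2
      + \lambda \sum_{j=1}^{N_u} \left[ \Delta u(t+j-1) \right]^2,

where \hat{y} is the prediction, supplied in this work by the bilinear model linearized at each time step, w is the reference trajectory, and \Delta u are the future control increments.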