30 results for Antiseptic formulations
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
In this paper we propose a metaheuristic to solve a new version of the Maximum Capture Problem (MCP). In the original MCP, market capture is determined by shorter traveling distances or shorter traveling times; in this new version, not only the traveling time but also the waiting time affects the market share. The problem is hard to solve using standard optimization techniques, whereas metaheuristics are shown to offer accurate results within acceptable computing times.
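For context, the classical MAXCAP model underlying this problem can be sketched as a binary program; the notation below is generic and not taken from the paper. With demand a_i at node i, x_j = 1 if a new facility opens at candidate site j, and y_i = 1 if node i is captured:

```latex
\max \sum_i a_i\, y_i
\quad \text{s.t.} \quad
y_i \le \sum_{j \in N_i} x_j \;\;\forall i,
\qquad \sum_j x_j = p,
\qquad x_j,\, y_i \in \{0,1\},
```

where, in the extended version described above, N_i = {j : t_{ij} + w_j < t_i^c + w^c} would be the set of sites beating the competitor at node i on combined travel time t_{ij} and waiting time w_j. Since the waiting time w_j itself depends on how much demand site j attracts, the capture sets are not fixed in advance, which is what defeats standard optimization techniques.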
Abstract:
A new direction of research in Competitive Location theory incorporates theories of Consumer Choice Behavior into its models. Following this direction, this paper studies the importance of consumer behavior with respect to distance or transportation costs in the optimality of locations obtained by traditional Competitive Location models. To do this, it considers different ways of defining a key parameter in the basic Maximum Capture model (MAXCAP). This parameter reflects various ways of taking distance into account, based on several Consumer Choice Behavior theories. For each model, the optimal locations are computed, together with the deviation in captured demand when the optimal locations of the other models are used instead of the true ones. A metaheuristic based on GRASP and Tabu search procedures is presented to solve all the models. Computational experience and an application to a 55-node network are also presented.
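As a rough illustration of the hybrid scheme named above (a GRASP construction phase followed by a tabu-search improvement phase), here is a minimal Python sketch; the objective function `capture` and the candidate data are placeholders, not the paper's actual model:

```python
import random

def grasp_tabu(sites, p, capture, iters=50, alpha=0.3, tenure=7):
    """Pick p sites maximizing capture(): GRASP construction + tabu search."""
    best, best_val = None, float("-inf")
    for _ in range(iters):
        # GRASP construction: greedy-randomized via a restricted candidate list.
        sol = set()
        while len(sol) < p:
            gains = {j: capture(sol | {j}) for j in sites if j not in sol}
            g_max, g_min = max(gains.values()), min(gains.values())
            rcl = [j for j, g in gains.items()
                   if g >= g_max - alpha * (g_max - g_min)]
            sol.add(random.choice(rcl))
        # Tabu search over swap moves with a recency-based tabu list.
        cur, tabu = set(sol), {}
        for it in range(100):
            move, move_val = None, float("-inf")
            for out in cur:
                for inn in set(sites) - cur:
                    if tabu.get((out, inn), -1) > it:
                        continue  # reversing a recent swap is forbidden
                    val = capture(cur - {out} | {inn})
                    if val > move_val:
                        move, move_val = (out, inn), val
            if move is None:
                break
            out, inn = move
            cur = cur - {out} | {inn}
            tabu[(inn, out)] = it + tenure  # tabu-mark the reverse swap
            if move_val > best_val:
                best, best_val = set(cur), move_val
    return best, best_val
```

The `alpha` parameter controls how greedy the construction is (0 = pure greedy, 1 = pure random) and `tenure` how long a reversed swap stays forbidden; both would need tuning on the actual capture model.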
Abstract:
There is currently growing interest in improving pharmacological therapy, not only through the production of new drugs but also by making the use of those that already exist more effective. Indeed, conventional pharmaceutical formulations of several drugs cause side effects attributable to oral administration. To avoid these undesired effects, current therapeutic research aims to develop formulations for alternative routes of administration. Thus, despite the undoubtedly complete absorption achieved parenterally, the transdermal and transbuccal routes appear to be rather attractive alternatives for providing efficient absorption. This chapter reports a new technological, biopharmaceutical and pharmacokinetic approach to strategies for application on the skin and buccal mucosa. In the future, new transdermal drug delivery systems will emerge that are more effective, have an improved aesthetic appearance, offer better adherence and achieve greater diffusion. Reaching these aims, however, requires prior knowledge of the histology and physiology of the skin and of the factors involved in drug penetration through it.
Abstract:
Personalized medicine is a challenging research area in paediatric treatments. Elaborating new paediatric formulations when no commercial forms are available is common practice in pharmacy laboratories; among these, oral liquid formulations are the most common. However, owing to the lack of specialized equipment, the studies needed to assure the efficacy and safety of the final medicine often cannot be carried out. The purpose of this work was therefore the development, characterization and stability evaluation of two oral formulations of sildenafil for the treatment of neonatal persistent pulmonary hypertension. After establishing a standard operating procedure (SOP) and preparing the formulations, the physicochemical stability parameters (appearance, pH, particle size, rheological behaviour and drug content) were evaluated at three different temperatures for 90 days. Long-term stability was also predicted and microbiological stability assessed. The formulations were a suspension and a slightly coloured solution, both with a fruity odour. Formulation I (the suspension) exhibited the best physicochemical properties, including Newtonian behaviour and an API content uniformity above 90%, ensuring an accurate dosing process.
Abstract:
Understanding nanomaterial interactions within cells is of increasing importance for assessing their toxicity and cellular transport. Here, we developed nanovesicles containing bioactive cationic lysine-based amphiphiles and assessed whether these cationic compounds increase the likelihood of intracellular delivery and modulate toxicity. We found different cytotoxic responses among the formulations, depending on the surfactant, cell line and endpoint assayed. Induction of mitochondrial dysfunction, oxidative stress and apoptosis were the general mechanisms underlying cytotoxicity. Fluorescence microscopy analysis demonstrated that the nanovesicles were internalized by HeLa cells and showed that their ability to release endocytosed materials into the cell cytoplasm depends on the structural parameters of the amphiphiles. The position of the cationic charge and the hydrophobicity of the surfactants determine the nanovesicle interactions within the cell and, thus, the resulting toxicity and intracellular behavior after cell uptake of the nanomaterial. These insights into the toxicity mechanisms of these new nanomaterials help reduce the uncertainty surrounding their potential health hazards.
Abstract:
Liberalism claims that for a subject S to be justified in believing p, a proposition about the external world, on the basis of his senses, it is not necessary to be antecedently justified in believing propositions such as that there is an external world. Conservatism, on the other hand, claims that to be justified in believing p on the basis of one's perception, one must have antecedent justification to believe such background propositions. Intuitively, we are inclined to think that liberalism about the structure of perceptual justification fits better with our epistemic practices: we acknowledge that, although we cannot produce warrant for the background belief in the external world, our empirical beliefs can be perceptually justified. However, I am interested in arguing that conservatism is theoretically better supported than liberalism. The first reason is that, in embracing liberalism, dogmatism is affected by pervasive problems. The second comes from recognizing the strength of the argument based on the thesis that experience is theory-laden. But not all are advantages for conservatism. Conservatism is presupposed in contemporary formulations of scepticism through the requirement of prior justification for background assumptions, and this fact compels anti-sceptical conservatives to conceive a non-evidential form of warrant, entitlement, to contest the sceptical threat. My main worry is that, although the path of entitlement has some prospects of succeeding, this new notion of justification seems to be posited ad hoc for conservatives to solve the sceptical problem. These contents are organized across three chapters. The result of chapter 1 is a pattern of sceptical argument formed by two premises: P1*, a conservative principle, and P2*. Chapters 2 and 3 describe two anti-sceptical proposals against the argument sketched in chapter 1. Chapter 2 is devoted to explaining and assessing a first anti-sceptical proposal that denies P1*: dogmatism. Chapter 3 describes another anti-sceptical strategy (the route of entitlement) that contests scepticism by denying the plausibility of P2*.
Abstract:
Research project carried out during a stay at the Dublin Institute for Advanced Studies, Ireland, between September and December 2009. In recent years, important progress has been made in three-dimensional magnetotelluric (MT) modelling, thanks to the growing number of available three-dimensional inversion algorithms. These codes use different formulations of the problem (finite differences, finite elements or integral equations), different orientations of the coordinate system, and either sign convention, plus or minus, in the time dependence. Nevertheless, the impedances produced by all these codes must be the same once they are converted to a common sign convention and to the same coordinate system. To compare the results of the different codes, we designed several resistivity models with three-dimensional structures embedded in a homogeneous subsurface. A fundamental requirement of these models is that they generate impedances with significant, non-negligible values in the diagonal elements. Unlike the one-dimensional and two-dimensional magnetotelluric modelling cases, in the three-dimensional case these diagonal elements of the impedance tensor carry information about the resistivity structure. One of the earth models is used to compare the different algorithms and serves as the basis for subsequent inversion with the different codes. This comparison was followed by an inversion to recover the data set of a known structure.
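To make the conversion step concrete: switching between the e^{+iωt} and e^{−iωt} time-dependence conventions conjugates the impedance tensor, and a horizontal rotation of the measurement axes acts as Z' = R Z Rᵀ. A minimal NumPy sketch of mapping a tensor to a common frame follows (function and argument names are illustrative):

```python
import numpy as np

def to_common_frame(Z, theta_deg=0.0, flip_sign_convention=False):
    """Map a 2x2 complex MT impedance tensor to a reference convention.

    Z                    : [[Zxx, Zxy], [Zyx, Zyy]], complex.
    theta_deg            : rotation aligning the code's axes with the reference.
    flip_sign_convention : True if the code uses the opposite e^{+/-iwt}
                           convention; switching it conjugates Z.
    """
    Z = np.asarray(Z, dtype=complex)
    if flip_sign_convention:
        Z = np.conj(Z)
    t = np.deg2rad(theta_deg)
    R = np.array([[np.cos(t), np.sin(t)],
                  [-np.sin(t), np.cos(t)]])
    return R @ Z @ R.T
```

Once every code's output has passed through such a mapping, the impedances, including the diagonal elements Zxx and Zyy, can be compared directly.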
Constraint algorithm for k-presymplectic Hamiltonian systems. Application to singular field theories
Abstract:
The k-symplectic formulation of field theories is especially simple, since only tangent and cotangent bundles are needed in its description. Its defining elements show a close relationship with those of the symplectic formulation of mechanics. It is shown that this relationship also holds in the presymplectic case. In a natural way, one can mimic the presymplectic constraint algorithm to obtain a constraint algorithm that can be applied to k-presymplectic field theory, and more particularly to the Lagrangian and Hamiltonian formulations of field theories defined by a singular Lagrangian, as well as to the unified Lagrangian-Hamiltonian formalism (Skinner-Rusk formalism) for k-presymplectic field theory. Two examples of application of the algorithm are also analyzed.
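For orientation, the mechanical algorithm being mimicked (the Gotay-Nester presymplectic constraint algorithm) generates a descending sequence of submanifolds; schematically,

```latex
P_0 = P, \qquad
P_{i+1} = \bigl\{\, p \in P_i \;:\; \exists X \in T_pP_i,\;
(\iota_X \omega - \mathrm{d}H)(p) = 0 \,\bigr\},
```

stopping when P_{i+1} = P_i =: P_f, the final constraint submanifold carrying consistent dynamics. In the k-presymplectic setting, the single form ω is replaced by a family ω^1, ..., ω^k and the equation ι_X ω = dH by \sum_{\alpha=1}^{k} \iota_{X_\alpha} \omega^\alpha = \mathrm{d}H for a k-vector field (X_1, ..., X_k); this is the analogy the paper exploits, though the precise statement there may differ.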
Abstract:
Two of the drawbacks of using natural-based composites in industrial applications are thermal instability and water uptake capacity. In this work, mechanical wood pulp was used to reinforce polypropylene at levels of 20 to 50 wt.%. Composites were mixed by means of a Brabender internal mixer for both non-coupled and coupled formulations. Differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) were used to determine the thermal properties of the composites. The water uptake behavior was evaluated by immersing the composites in water until an equilibrium state was reached. The water absorption tests revealed that the amount of water absorbed clearly depended on the fiber content, and the coupled composites showed lower water absorption than the uncoupled ones. The incorporation of mechanical wood pulp into the polypropylene matrix produced a clear nucleating effect, increasing the degree of crystallinity of the polymer and also raising the polymer degradation temperature. The maximum degradation temperature for stone groundwood pulp-reinforced composites was in the range of 330 to 345 °C.
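The abstract does not state the measure used, but water uptake by immersion is conventionally reported as the relative mass gain,

```latex
WU(\%) = 100 \times \frac{w_t - w_0}{w_0},
```

with w_0 the dry specimen mass and w_t the mass after immersion time t, equilibrium being declared once successive weighings stop changing appreciably.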
Abstract:
The behavior of stone groundwood / polypropylene injection-molded composites was evaluated with and without coupling agent. Stone groundwood (SGW) is a fibrous material commonly prepared in a high-yield process and mainly used for papermaking. In this work, the use of SGW fibers was explored as a reinforcing element for polypropylene (PP) composites. The surface charge density of the composite components was evaluated, as well as the length and diameter of the fibers inside the composite material. Two mixing extrusion processes were evaluated; the use of a kinetic mixer, instead of an internal mixer, resulted in longer mean lengths of the reinforcing fibers. On the other hand, the accessibility of the surface hydroxyl groups of the stone groundwood fibers was improved by treating the fibers with 5% sodium hydroxide, resulting in a noticeable increase in the tensile strength of the composites for a similar percentage of coupling agent. A new parameter, the Fiber Tensile Strength Factor, is defined and used as a baseline for comparing the properties of the different composite materials. Finally, the competitiveness of the stone groundwood / polypropylene / polypropylene-co-maleic anhydride system, which compared favorably to sized glass-fiber / polypropylene and glass-fiber / polypropylene / polypropylene-co-maleic anhydride composite formulations, was quantified by means of the Fiber Tensile Strength Factor.
Abstract:
The achievable region approach seeks solutions to stochastic optimisation problems by: (i) characterising the space of all possible performances (the achievable region) of the system of interest, and (ii) optimising the overall system-wide performance objective over this space. This is radically different from conventional formulations based on dynamic programming. The approach is explained with reference to a simple two-class queueing system. Powerful new methodologies due to the authors and co-workers are deployed to analyse a general multiclass queueing system with parallel servers and then to develop an approach to optimal load distribution across a network of interconnected stations. Finally, the approach is used for the first time to analyse a class of intensity control problems.
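To make steps (i) and (ii) concrete for the two-class example: for an M/G/1 queue under non-idling, non-preemptive policies, Kleinrock's conservation law confines the mean waiting times to a line segment (the achievable region), whose endpoints are the two strict-priority policies:

```latex
\rho_1 W_1 + \rho_2 W_2 = \frac{\rho\, W_0}{1-\rho},
\qquad W_0 = \tfrac{1}{2} \sum_{i=1}^{2} \lambda_i\, \mathbb{E}[S_i^2],
\qquad \rho = \rho_1 + \rho_2.
```

Minimising a linear cost c_1 λ_1 W_1 + c_2 λ_2 W_2 over this segment is then a trivial linear program whose solution recovers the classical c-mu rule: give priority to the class with the larger c_i μ_i. The same style of argument extends, with considerably more polyhedral machinery, to the harder systems listed above.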
Abstract:
Revenue management practices often include overbooking capacity to account for customers who make reservations but do not show up. In this paper, we consider the network revenue management problem with no-shows and overbooking, where the show-up probabilities are specific to each product. No-show rates differ significantly by product (for instance, by each itinerary and fare combination for an airline), as sale restrictions and demand characteristics vary by product. However, models that consider no-show rates for each individual product are difficult to handle, as the state space in dynamic programming formulations (or the variable space in approximations) increases significantly. In this paper, we propose a randomized linear program to jointly make the capacity control and overbooking decisions with product-specific no-shows. We establish that our formulation gives an upper bound on the optimal expected total profit, and that our upper bound is tighter than a deterministic linear programming upper bound that appears in the existing literature. Furthermore, we show that our upper bound is asymptotically tight in a regime where the leg capacities and the expected demand are scaled linearly at the same rate. We also describe how the randomized linear program can be used to obtain a bid-price control policy. Computational experiments indicate that our approach is quite fast, scales to industrial problems, and can provide significant improvements over standard benchmarks.
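For orientation, the deterministic linear programming benchmark mentioned above is usually written roughly as follows (notation assumed here, not taken from the paper): with fare f_j and show-up probability q_j for product j, leg capacities c_i, incidence a_{ij}, denied-service penalty rate γ_i and expected demand E[D_j],

```latex
\max_{x, z}\; \sum_j f_j\, x_j - \sum_i \gamma_i\, z_i
\quad \text{s.t.} \quad
\sum_j a_{ij}\, q_j\, x_j \le c_i + z_i \;\;\forall i,
\qquad 0 \le x_j \le \mathbb{E}[D_j], \quad z_i \ge 0,
```

where x_j is the number of bookings accepted for product j and z_i the expected overcapacity on leg i. A randomized variant in the spirit described above would replace E[D_j] with sampled demands and average the optimal values over samples; since the LP value is concave in the demand bounds, Jensen's inequality makes the averaged bound no looser than the deterministic one, consistent with the tightening the authors prove.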
Abstract:
Models incorporating more realistic representations of customer behavior, such as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and is approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. When there are products that are considered for purchase by more than one customer segment, the CDLP is difficult to solve, since column generation is known to be NP-hard. However, recent research indicates that a formulation based on segments, with cuts imposing consistency (SDCP+), is tractable and approximates the CDLP value very closely. In this paper we investigate the structure of the consideration sets that make the two formulations exactly equal. We show that if the segment consideration sets follow a tree structure, then CDLP = SDCP+. We give a counterexample to show that cycles can induce a gap between the CDLP and the SDCP+ relaxation. Based on cycles in the consideration set structure, we derive two classes of valid inequalities, called flow and synchronization inequalities, to further improve SDCP+. We present a numerical study showing the performance of these cycle-based cuts.
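For reference, the CDLP discussed here is typically written with one variable per offer set (a sketch with standard notation; details may differ from the paper): t_S is the total time set S is offered out of a horizon of length T, λ the arrival rate, R(S) the expected revenue per arrival and Q_i(S) the expected capacity consumed on leg i per arrival when S is offered:

```latex
\max_{t}\; \sum_{S} \lambda\, R(S)\, t_S
\quad \text{s.t.} \quad
\sum_{S} \lambda\, Q_i(S)\, t_S \le c_i \;\;\forall i,
\qquad \sum_{S} t_S \le T, \quad t_S \ge 0.
```

The exponential number of t_S variables is what forces column generation, and its NP-hardness under overlapping segment consideration sets is what motivates the segment-based SDCP+ relaxation and the cuts studied in the paper.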