910 results for mathematical modelling, controlled drug release, polymer micro-spheres, swelling, boundary problems
Abstract:
In this paper, we propose three novel mathematical models for the two-stage lot-sizing and scheduling problems present in many process industries. The problem shares a continuous or quasi-continuous production feature upstream and a discrete manufacturing feature downstream, which must be synchronized. Different time-based scale representations are discussed. The first formulation encompasses a discrete-time representation. The second one is a hybrid continuous-discrete model. The last formulation is based on a continuous-time model representation. Computational tests with a state-of-the-art MIP solver show that the discrete-time representation provides better feasible solutions in short running times. On the other hand, the hybrid model achieves better solutions for longer computational times and was able to prove optimality more often. The continuous-type model is the most flexible of the three for incorporating additional operational requirements, at the cost of the worst computational performance. Journal of the Operational Research Society (2012) 63, 1613-1630. doi:10.1057/jors.2011.159 published online 7 March 2012
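The discrete-time representation discussed above can be illustrated with a far smaller relative: the classic single-item, uncapacitated discrete-time lot-sizing problem, solvable exactly by the Wagner-Whitin dynamic program. This is only an illustrative sketch (function name and cost data are ours), not one of the paper's three two-stage formulations:

```python
def wagner_whitin(demand, setup, hold):
    """Optimal cost of discrete-time lot-sizing via the Wagner-Whitin DP.

    demand[t] is the demand of period t; 'setup' is the fixed cost of a
    production run and 'hold' the per-unit, per-period holding cost.
    """
    T = len(demand)
    best = [0.0] * (T + 1)          # best[t] = min cost to cover periods 0..t-1
    for t in range(1, T + 1):
        candidates = []
        for j in range(t):          # last production run covers periods j..t-1
            carry = sum((k - j) * demand[k] for k in range(j, t))
            candidates.append(best[j] + setup + hold * carry)
        best[t] = min(candidates)
    return best[T]
```

With a high setup cost it pays to batch periods together; with a low one it pays to produce just in time, which is exactly the trade-off the recursion enumerates.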
Abstract:
The thesis consists of three independent parts. Part I: Polynomial amoebas We study the amoeba of a polynomial, as defined by Gelfand, Kapranov and Zelevinsky. A central role in the treatment is played by a certain convex function which is linear in each complement component of the amoeba, which we call the Ronkin function. This function is used in two different ways. First, we use it to construct a polyhedral complex, which we call a spine, approximating the amoeba. Second, the Monge-Ampère measure of the Ronkin function has interesting properties which we explore. This measure can be used to derive an upper bound on the area of an amoeba in two dimensions. We also obtain results on the number of complement components of an amoeba, and consider possible extensions of the theory to varieties of codimension higher than 1. Part II: Differential equations in the complex plane We consider polynomials in one complex variable arising as eigenfunctions of certain differential operators, and obtain results on the distribution of their zeros. We show that in the limit when the degree of the polynomial approaches infinity, its zeros are distributed according to a certain probability measure. This measure has its support on the union of finitely many curve segments, and can be characterized by a simple condition on its Cauchy transform. Part III: Radon transforms and tomography This part is concerned with different weighted Radon transforms in two dimensions, in particular the problem of inverting such transforms. We obtain stability results of this inverse problem for rather general classes of weights, including weights of attenuation type with data acquisition limited to a 180 degrees range of angles. We also derive an inversion formula for the exponential Radon transform, with the same restriction on the angle.
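For the simplest nontrivial case f(z, w) = 1 + z + w, the amoeba {(log|z|, log|w|) : f(z, w) = 0} can be sampled directly, since w is determined by z. A small illustrative sketch (the function name and sampling scheme are ours, not from the thesis):

```python
import cmath
import math

def amoeba_points(n=200):
    """Sample the amoeba of f(z, w) = 1 + z + w, i.e. the image of its
    zero set under (z, w) -> (log|z|, log|w|), by sweeping z over circles
    of several radii and solving w = -(1 + z)."""
    pts = []
    for r in (0.25, 0.5, 1.0, 2.0, 4.0):
        for k in range(n):
            z = r * cmath.exp(2j * math.pi * k / n)
            w = -(1 + z)
            if abs(w) > 1e-12:      # skip the singular point z = -1
                pts.append((math.log(abs(z)), math.log(abs(w))))
    return pts
```

A classical fact for linear polynomials: a point (x, y) lies in this amoeba exactly when 1, e^x and e^y satisfy the triangle inequalities, which the sampled points do by construction; the complement has three components, one per monomial.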
Abstract:
Being basic ingredients of numerous daily-life products with significant industrial importance, as well as basic building blocks for biomaterials, charged hydrogels continue to pose a series of unanswered challenges for scientists even after decades of practical applications and intensive research efforts. Despite a rather simple internal structure, it is mainly the unique combination of short- and long-range forces which renders scientific investigation of their characteristic properties quite difficult. Hence, computer simulations were used early on to link analytical theory and empirical experiments, bridging the gap between the simplifying assumptions of the models and the complexity of real-world measurements. Due to the immense numerical effort, even for high-performance supercomputers, system sizes and time scales were rather restricted until recently; only now has it become possible to simulate a full network of charged macromolecules. This is the topic of the present thesis, which investigates one of the fundamental and at the same time highly fascinating phenomena of polymer research: the swelling behaviour of polyelectrolyte networks. For this purpose an extensible simulation package for research on soft-matter systems, ESPResSo for short, was created, which puts a particular emphasis on mesoscopic bead-spring models of complex systems. Highly efficient algorithms and a consistent parallelization reduce the computation time needed to solve the equations of motion, even in the presence of long-range electrostatics and large numbers of particles, making even expensive calculations and applications tractable. Nevertheless, the program has a modular and simple structure, enabling a continuous process of adding new potentials, interactions, degrees of freedom, ensembles, and integrators, while staying easily accessible to newcomers thanks to a Tcl-script steering level that controls the C-implemented simulation core.
Numerous analysis routines provide means to investigate system properties and observables on the fly. Even though analytical theories have agreed on the modelling of networks in past years, our numerical MD simulations show that, even for simple model systems, fundamental theoretical assumptions no longer apply outside a small parameter regime, prohibiting correct predictions of observables. Applying a "microscopic" analysis of the isolated contributions of individual system components, one of the particular strengths of computer simulations, it was then possible to describe the behaviour of charged polymer networks at swelling equilibrium in good solvent and close to the Theta-point by introducing appropriate model modifications. This became possible by enhancing known simple scaling arguments with components deemed crucial in our detailed study, through which a generalized model could be constructed. With it, agreement between the final system volume of swollen polyelectrolyte gels and the results of computer simulations could be shown over the entire investigated range of parameters, for different network sizes, charge fractions, and interaction strengths. In addition, the "cell under tension" was presented as a self-regulating approach for predicting the amount of swelling based solely on the chosen system parameters. Without the need for measured observables as input, minimizing the free energy alone already determines the equilibrium behaviour. In poor solvent the shape of the network chains changes considerably, as their hydrophobicity now counteracts the repulsion of like-charged monomers and drives the polyelectrolytes toward collapse. Depending on the chosen parameters, a fragile balance emerges, giving rise to fascinating geometrical structures such as the so-called pearl-necklaces.
This behaviour, known from single-chain polyelectrolytes under similar environmental conditions and also predicted theoretically, could here be detected for networks for the first time as well. An analysis of the total structure factors confirmed the first evidence for the existence of such structures found in experimental results.
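The mesoscopic bead-spring picture mentioned above can be sketched with a toy molecular-dynamics loop. The following is a deliberately minimal illustration (a 1-D chain with unit masses and harmonic bonds only, no electrostatics or thermostat), not ESPResSo code; function and parameter names are ours:

```python
def simulate_chain(n=5, k=10.0, dt=0.01, steps=1000):
    """Velocity-Verlet MD of a 1-D bead-spring chain with harmonic bonds
    of stiffness k and rest length 1. Returns (initial, final) total energy,
    which the symplectic integrator should conserve to high accuracy."""
    x = [1.1 * i for i in range(n)]        # slightly stretched chain
    v = [0.0] * n

    def forces(x):
        f = [0.0] * n
        for i in range(n - 1):
            ext = (x[i + 1] - x[i]) - 1.0  # bond extension beyond rest length
            f[i] += k * ext
            f[i + 1] -= k * ext
        return f

    def energy(x, v):
        pot = sum(0.5 * k * ((x[i + 1] - x[i]) - 1.0) ** 2 for i in range(n - 1))
        kin = sum(0.5 * vi * vi for vi in v)
        return pot + kin

    f = forces(x)
    e0 = energy(x, v)
    for _ in range(steps):
        v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]   # half kick
        x = [xi + dt * vi for xi, vi in zip(x, v)]         # drift
        f = forces(x)
        v = [vi + 0.5 * dt * fi for vi, fi in zip(v, f)]   # half kick
    return e0, energy(x, v)
```

Real simulations add excluded volume, electrostatics (e.g. P3M) and a thermostat; the point here is only the integrator structure around which such packages are built.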
Abstract:
A set of problems concerning the behaviour of a suddenly disturbed ideal floating zone is considered. Mathematical techniques of asymptotic expansions are used to solve these problems. It is seen that many already available solutions, most of them concerning liquids enclosed in cavities, can be regarded as starting approximations which are valid except in the proximity of the free surface that laterally bounds the floating zone. In particular, the problem of the linear spin-up of an initially cylindrical floating zone is considered in some detail. The presence of a recirculating fluid pattern near the free surface is detected. This configuration is attributed to the interplay between Coriolis forces and the azimuthal component of the viscous forces.
Abstract:
Problem-based learning has been applied over the last three decades to a diverse range of learning environments. In this educational approach, different problems are posed to the learners so that they can develop different solutions while learning about the problem domain. When applied to conceptual modelling, and particularly to Qualitative Reasoning, the solutions to problems are models that represent the behaviour of a dynamic system. The learner's task then is to bridge the gap between their initial model, as their first attempt to represent the system, and the target models that provide solutions to that problem. We propose the use of semantic technologies and resources to help in bridging that gap by providing links to terminology and formal definitions, and matching techniques to allow learners to benefit from existing models.
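One of the simplest matching techniques of the kind alluded to above is to compare the learner's model vocabulary against each candidate target model's vocabulary. A hypothetical sketch (the model names, term sets and functions are invented for illustration; real systems match against formal ontology terms rather than raw strings):

```python
def jaccard(a, b):
    """Jaccard similarity between two term sets: |A & B| / |A | B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

def closest_target(learner_terms, target_models):
    """Rank candidate target models by term overlap with the learner's
    model and return the name of the best match."""
    return max(target_models,
               key=lambda name: jaccard(learner_terms, target_models[name]))
```

Such a score can then drive feedback, e.g. suggesting which target-model quantities the learner's model is still missing.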
Abstract:
In engineering design and development, before beginning the construction and implementation of a project's objectives, a series of preliminary analyses and simulations must be carried out to corroborate the expectations of the initial hypothesis, in order to obtain an empirical reference that satisfies the working or operating conditions of the project's objectives. Often, results satisfying the desired characteristics are obtained by iterating trial-and-error methods. Generally, these methods use the same analysis procedure while varying a series of parameters that allow a technology to be adapted to the desired purpose. Nowadays powerful computers are available, as well as mathematical solution algorithms that allow different kinds of computational problems to be solved quickly and efficiently. It is therefore attractive to develop applications that solve these problems quickly and accurately for the analysis and synthesis of engineering solutions, especially when similar expressions with varying constants are involved, since solution routines can be written that accept as input the parameters defining the problem. Moreover, by implementing code according to the theoretical basis of a technology, a program can be obtained that is valid for studying any problem related to that technology. The present project aims to implement the first phase of the optical-device simulator Slabsim, which can represent the energy distribution of an electromagnetic wave at optical frequencies guided through a planar dielectric waveguide, also known as a slab.
This simulator consists of a graphical interface generated with Matlab GUIDE, the graphical-user-interface development environment owned by Mathworks©, so that it is simple and intuitive to operate and simulations can be run even by users with little knowledge of the underlying theory of this type of structure. In this way the engineer needs less time to find a solution satisfying the requirements of a project involving planar dielectric waveguides, and can even use the tool for a wide variety of purposes based on this technology. One of the main objectives of this project is to solve the theoretical equations of slab waveguides by computational numerical methods, whose procedures carry over to other mathematical problems and give the author a solid conceptual grounding in them. For this reason, the differential and characteristic equations posed by this type of structure are solved by these computational means in the core of the application, since in some cases no useful analytical expressions are available. ABSTRACT. The first step in engineering design and development is an analysis and simulation process which corroborates the initial hypothesis and finds solutions for a particular aim. In this way, it is possible to obtain empirical evidence which suitably substantiates the purposes of the project. Commonly, the characteristics needed to reach a particular target are found through iterative trial-and-error methods. These methods are based on the same theoretical analysis but with a variation of some parameters, with the objective of adapting the results to a particular aim. At present, powerful computers and mathematical algorithms are available to solve different kinds of calculation problems in a fast and efficient way.
Developing computing applications is worthwhile because it yields highly accurate results for engineering analysis and synthesis in short periods of time. This is most notable in cases where the mathematical expressions of a theoretical base are similar except for small variations of constant values, since the programming code is easily adapted into a parameter-request system that defines a particular solution on each execution. Additionally, it is possible to code an application suitable for simulating any issue related to the studied technology. The aim of the present project is the construction of the first stage of an optoelectronics simulator named Slabsim, which is capable of representing the energy distribution of a light wave guided in the volume of a slab waveguide. The simulator is built with the graphical user interface development environment Matlab GUIDE, property of Mathworks©. It is designed so that the user can run simulations easily and intuitively, with little knowledge of the theoretical foundations of the technology. With this software the user can achieve several aims related to slab waveguides in a short time. One of the main purposes of this project is the mathematical solution of the theoretical foundations of slab structures through computational numerical analysis, since these methods can be adapted to other mathematical problems and provide a strong understanding of the underlying processes. Based on these advantages, numerical solution methods are used in the core of the simulator to obtain and display the results of the differential and characteristic equations.
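The kind of characteristic equation such a core solves numerically can be illustrated with the standard textbook dispersion relation for the fundamental even TE mode of a symmetric slab waveguide, solved here by bisection. This is a generic sketch under our own naming, not Slabsim's actual implementation:

```python
import math

def even_te_mode(V, tol=1e-12):
    """Solve the characteristic equation of the fundamental even TE mode
    of a symmetric slab waveguide by bisection:

        u * tan(u) = sqrt(V**2 - u**2),   0 < u < min(pi/2, V),

    where V is the normalized frequency and u the normalized transverse
    phase parameter. On this interval the left side minus the right side
    is increasing, so bisection converges to the unique root."""
    g = lambda u: u * math.tan(u) - math.sqrt(V * V - u * u)
    lo, hi = 1e-9, min(math.pi / 2, V) - 1e-9
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

From u one recovers the mode's effective index and field profile; no closed-form expression for u exists, which is exactly why a numerical core is needed.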
Abstract:
Let f : [0,1] × R² → R be a function satisfying the Carathéodory conditions and t(1−t)e(t) ∈ L¹(0,1). Let a_i ∈ R and ξ_i ∈ (0,1) for i = 1, ..., m−2, where 0 < ξ_1 < ξ_2 < ... < ξ_{m−2} < 1. In this paper we study the existence of C[0,1] solutions for the m-point boundary value problem [GRAPHICS]. The proof of our main result is based on the Leray-Schauder continuation theorem.
Abstract:
We consider the boundary value problems for nonlinear second-order differential equations of the form u″ + a(t)f(u) = 0, 0 < t < 1, u(0) = u(1) = 0. We give conditions on the ratio f(s)/s at infinity and zero that guarantee the existence of solutions with prescribed nodal properties. Then we establish existence and multiplicity results for nodal solutions to the problem. The proofs of our main results are based upon bifurcation techniques. (c) 2004 Elsevier Ltd. All rights reserved.
Abstract:
We consider boundary value problems for nonlinear second-order differential equations of the form u″ + a(t)f(u) = 0, t ∈ (0,1), u(0) = u(1) = 0, where a ∈ C([0,1], (0,∞)) and f : R → R is continuous and satisfies f(s)s > 0 for s ≠ 0. We establish existence and multiplicity results for nodal solutions to the problems if either f₀ = 0 and f_∞ = ∞, or f₀ = ∞ and f_∞ = 0, where f(s)/s approaches f₀ and f_∞ as s approaches 0 and ∞, respectively. We use bifurcation techniques to prove our main results. (C) 2004 Elsevier Inc. All rights reserved.
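The existence mechanism behind such solutions can be seen numerically with a shooting method: vary the initial slope u′(0) = s until the first zero of u lands exactly at t = 1, which yields a one-sign solution of the boundary value problem. A sketch for the concrete choice a(t) ≡ 1, f(u) = u³ (our example, not taken from the paper):

```python
def shoot(s, dt=1e-3):
    """Integrate u'' + u**3 = 0 with u(0) = 0, u'(0) = s by classical RK4
    and return the first time at which u crosses zero again."""
    u, v, t = 0.0, s, 0.0
    rhs = lambda u, v: (v, -u ** 3)
    while t < 10.0:
        k1u, k1v = rhs(u, v)
        k2u, k2v = rhs(u + 0.5 * dt * k1u, v + 0.5 * dt * k1v)
        k3u, k3v = rhs(u + 0.5 * dt * k2u, v + 0.5 * dt * k2v)
        k4u, k4v = rhs(u + dt * k3u, v + dt * k3v)
        un = u + dt * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        vn = v + dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        t += dt
        if un <= 0.0:
            return t - dt + dt * u / (u - un)   # interpolate the crossing
        u, v = un, vn
    return 10.0

def positive_solution_slope(lo=1.0, hi=20.0, iters=40):
    """Bisect on the shooting slope s so the first zero of u lands at t = 1,
    giving a positive solution of u'' + u**3 = 0, u(0) = u(1) = 0.
    For this superlinear f the first-zero time decreases with s, so the
    bisection is well founded."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if shoot(mid) > 1.0:    # zero comes too late: increase the slope
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Larger slopes shorten the solution's half-period, so slopes producing exactly k interior arcs give the nodal solutions with k − 1 interior zeros whose existence the bifurcation arguments establish.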
Abstract:
Drugs to treat inflammation are discussed under the following headings: (1) random discoveries covering copper, salicylates, heterocyclic diones, ACTH, adrenal steroids and disease-modifying agents (DMARDs); these include Au(I)-thiolates, chloroquine, and hydroxychloroquine, minocycline, cyclosporin, salazopyrine, D-penicillamine and methotrexate; (2) programmed NSAID developments covering salicylates and fenamates, arylalkanoates, diones, non-acidic NSAIDs, clozic, lobenzarit and coxibs; (3) synthetic glucocorticosteroids; and (4) 'Biologicals' for neutralising pro-inflammatory cytokines. Clinical problems are highlighted, particularly unacceptable side-effects affecting the GI tract, skin, liver, etc. that caused many drugs to be withdrawn. Drug combinations may overcome some of these problems. The bibliography has selected reviews and monographs covering 50 years of publications.
Abstract:
The main focus of this paper is on mathematical theory and methods which have a direct bearing on problems involving multiscale phenomena. Modern technology is refining measurement and data collection to spatio-temporal scales on which observed geophysical phenomena are displayed as intrinsically highly variable and intermittent hierarchical structures, e.g. rainfall, turbulence, etc. The hierarchical structure is reflected in the occurrence of a natural separation of scales which collectively manifest at some basic unit scale. Thus proper data analysis and inference require a mathematical framework which couples the variability over multiple decades of scale, in which basic theoretical benchmarks can be identified and calculated. This continues the main theme of the research in this area of applied probability over the past twenty years.
Abstract:
The U.S. railroad companies spend billions of dollars every year on railroad track maintenance in order to ensure safety and operational efficiency of their railroad networks. Besides maintenance costs, other costs such as train accident costs, train and shipment delay costs and rolling stock maintenance costs are also closely related to track maintenance activities. Optimizing the track maintenance process on the extensive railroad networks is a very complex problem with major cost implications. Currently, the decision making process for track maintenance planning is largely manual and primarily relies on the knowledge and judgment of experts. There is considerable potential to improve the process by using operations research techniques to develop solutions to the optimization problems on track maintenance. In this dissertation study, we propose a range of mathematical models and solution algorithms for three network-level scheduling problems on track maintenance: track inspection scheduling problem (TISP), production team scheduling problem (PTSP) and job-to-project clustering problem (JTPCP). TISP involves a set of inspection teams which travel over the railroad network to identify track defects. It is a large-scale routing and scheduling problem where thousands of tasks are to be scheduled subject to many difficult side constraints such as periodicity constraints and discrete working time constraints. A vehicle routing problem formulation was proposed for TISP, and a customized heuristic algorithm was developed to solve the model. The algorithm iteratively applies a constructive heuristic and a local search algorithm in an incremental scheduling horizon framework. The proposed model and algorithm have been adopted by a Class I railroad in its decision making process. Real-world case studies show the proposed approach outperforms the manual approach in short-term scheduling and can be used to conduct long-term what-if analyses to yield managerial insights. 
PTSP schedules capital track maintenance projects, which are the largest track maintenance activities and account for the majority of railroad capital spending. A time-space network model was proposed to formulate PTSP. More than ten types of side constraints were considered in the model, including very complex constraints such as mutual exclusion constraints and consecution constraints. A multiple neighborhood search algorithm, including a decomposition and restriction search and a block-interchange search, was developed to solve the model. Various performance enhancement techniques, such as data reduction, augmented cost function and subproblem prioritization, were developed to improve the algorithm. The proposed approach has been adopted by a Class I railroad for two years. Our numerical results show the model solutions are able to satisfy all hard constraints and most soft constraints. Compared with the existing manual procedure, the proposed approach is able to bring significant cost savings and operational efficiency improvement. JTPCP is an intermediate problem between TISP and PTSP. It focuses on clustering thousands of capital track maintenance jobs (based on the defects identified in track inspection) into projects so that the projects can be scheduled in PTSP. A vehicle routing problem based model and a multiple-step heuristic algorithm were developed to solve this problem. Various side constraints such as mutual exclusion constraints and rounding constraints were considered. The proposed approach has been applied in practice and has shown good performance in both solution quality and efficiency.
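The "constructive heuristic plus local search" pattern described for TISP can be sketched, in a drastically simplified form, on a plain travelling-salesman instance: nearest-neighbour construction followed by 2-opt improvement. This toy code (all names ours) ignores the side constraints, such as periodicity and discrete working times, that make the real scheduling problems hard:

```python
import math
from itertools import combinations

def tour_length(pts, tour):
    """Total length of a closed tour over the given points."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbor(pts):
    """Constructive heuristic: start at point 0 and repeatedly visit the
    closest unvisited point."""
    unvisited = set(range(1, len(pts)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(pts[last], pts[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(pts, tour):
    """Local search: reverse tour segments as long as doing so shortens
    the tour, i.e. until a 2-opt local optimum is reached."""
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(1, len(tour)), 2):
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            if tour_length(pts, cand) < tour_length(pts, tour) - 1e-12:
                tour, improved = cand, True
    return tour
```

The production algorithms iterate construction and improvement inside an incremental scheduling horizon; the toy above only shows the two building blocks and how the local search never worsens the constructed solution.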
Abstract:
Background: Mood and anxiety disorders, and problems with self-harm, are significant and serious issues that are common in young people in the Criminal Justice System. Aims: To examine whether interventions relevant to young offenders with mood or anxiety disorders, or problems with self-harm, are effective. Method: Systematic review and meta-analysis of data from randomised controlled trials relevant to young offenders experiencing these problems. Results: An exhaustive search of the worldwide literature (published and unpublished) yielded 10 studies suitable for inclusion in this review. Meta-analysis of data from three studies (with a total population of 171 individuals) revealed that group-based Cognitive Behaviour Therapy (CBT) may help to reduce symptoms of depression in young offenders. Conclusions: These preliminary findings suggest that group-based CBT may be useful for young offenders with such mental health problems, but larger high-quality RCTs are now needed to bolster the evidence base.
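For readers unfamiliar with the pooling step of such a meta-analysis, the standard fixed-effect inverse-variance method combines study effect sizes as below. This is a generic sketch with invented numbers; the review's actual data and method may differ (e.g. a random-effects model):

```python
import math

def fixed_effect_pool(estimates, variances):
    """Inverse-variance fixed-effect pooling of study effect sizes.

    Each study is weighted by 1/variance; returns the pooled estimate and
    its approximate 95% confidence interval."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # pooled standard error
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

Note how a more precise study (smaller variance) pulls the pooled estimate toward its own effect size, which is why three small trials totalling 171 participants can only give preliminary evidence.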
Abstract:
We developed a library of nanoparticles (NPs) from poly(ethylene glycol)–poly(lactic acid) comb-like polymers with variable amounts of PEG. Curcumin was encapsulated in the NPs with a view to developing a delivery platform to treat diseases involving oxidative stress affecting the CNS. We observed a sharp decrease in size between 15 and 20% w/w of PEG, which corresponds to a transition from a large solid-particle structure to a “micelle-like” or “polymer nano-aggregate” structure. Drug loading, loading efficiency and release kinetics were determined. The diffusion coefficients of curcumin in NPs were determined using a mathematical model. Higher diffusion was observed for solid particles compared to “polymer nano-aggregate” particles. NPs did not present any significant toxicity when tested in vitro on a neuronal cell line. Moreover, the ability of NPs carrying curcumin to prevent oxidative stress was evidenced and linked to polymer architecture and NP organization. Our study shows the intimate relationship between the polymer architecture and the biophysical properties of the resulting NPs and sheds light on new approaches to design efficient NP-based drug carriers.
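A common way to extract a diffusion coefficient from release data, used here purely as an illustration since the abstract does not specify the paper's model, is to fit the early-time solution of Fick's law for a sphere, M_t/M_inf ≈ 6·sqrt(D·t/(pi·R²)) (valid roughly while M_t/M_inf < 0.4). Function and variable names are ours:

```python
import math

def fit_diffusion(times, fractions, radius):
    """Fit the early-time release law  M_t/M_inf = 6*sqrt(D*t/(pi*R^2))
    by least squares on y = k*sqrt(t) (slope through the origin), then
    recover the diffusion coefficient D = k^2 * pi * R^2 / 36."""
    num = sum(math.sqrt(t) * y for t, y in zip(times, fractions))
    den = sum(times)
    k = num / den                     # least-squares slope vs sqrt(t)
    return k * k * math.pi * radius * radius / 36.0
```

On synthetic data generated from the same law the fit recovers D exactly, which is a useful sanity check before applying it to noisy experimental release curves.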