883 results for mathematical modelling
Abstract:
We investigate an application of the method of fundamental solutions (MFS) to the one-dimensional inverse Stefan problem for the heat equation, extending the MFS proposed in [5] for the one-dimensional direct Stefan problem. The sources are placed outside the space domain of interest and in the time interval (-T, T). Theoretical properties of the method, as well as numerical investigations, are included, showing that accurate and stable results can be obtained at small computational cost.
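The MFS described in this abstract builds its approximation from copies of the heat-equation fundamental solution centred at sources placed outside the space domain and at times in (-T, T). A minimal sketch of that kernel and an illustrative source layout (the domain, source offsets and counts below are assumptions for illustration, not the paper's configuration):

```python
import numpy as np

def F(x, t):
    """Fundamental solution of u_t = u_xx, valid for t > 0 (it is 0 for t <= 0)."""
    return np.exp(-x**2 / (4.0 * t)) / (2.0 * np.sqrt(np.pi * t))

# check numerically that F satisfies the heat equation at an interior point
x0, t0, h = 0.5, 1.0, 1e-3
u_t = (F(x0, t0 + h) - F(x0, t0 - h)) / (2 * h)                 # central difference in t
u_xx = (F(x0 + h, t0) - 2 * F(x0, t0) + F(x0 - h, t0)) / h**2   # central difference in x
pde_residual = abs(u_t - u_xx)

# an illustrative MFS source layout for the space domain (0, 1) with T = 1:
# source abscissae lie outside (0, 1), source times lie inside (-T, T)
T = 1.0
src_x = np.concatenate([np.full(10, -0.5), np.full(10, 1.5)])
src_t = np.tile(np.linspace(-0.9 * T, 0.9 * T, 10), 2)
```

The MFS approximation would then be a linear combination of `F(x - src_x[j], t - src_t[j])` with coefficients fitted by collocation on the known boundary data.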
Abstract:
The increasing intensity of global competition has led organizations to utilize various performance measurement tools to improve the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. In conventional DEA with input and/or output ratios, all data are assumed to be crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems not suitable for, or difficult to model with, crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios, where the traditional indicators are mostly financial ratios, to demonstrate the applicability and efficacy of the proposed models. © 2011 Elsevier Inc.
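For readers unfamiliar with DEA, the crisp baseline that such interval models generalize is a small linear program solved once per DMU. A minimal sketch of the classical input-oriented CCR envelopment model (not the paper's interval-ratio models; the three-DMU data set is invented for illustration):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of DMU `o` (envelopment form):
    minimise theta s.t. sum_j lam_j x_j <= theta * x_o, sum_j lam_j y_j >= y_o."""
    n, m = inputs.shape            # DMUs x inputs
    _, s = outputs.shape           # DMUs x outputs
    c = np.zeros(1 + n)            # variables: [theta, lam_1 .. lam_n]
    c[0] = 1.0                     # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):             # input constraints
        A_ub.append(np.r_[-inputs[o, i], inputs[:, i]]); b_ub.append(0.0)
    for r in range(s):             # output constraints (flipped to <=)
        A_ub.append(np.r_[0.0, -outputs[:, r]]); b_ub.append(-outputs[o, r])
    bounds = [(0, None)] * (1 + n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

X = np.array([[2.0], [4.0], [3.0]])   # one input per DMU
Y = np.array([[2.0], [2.0], [3.0]])   # one output per DMU
scores = [ccr_efficiency(X, Y, o) for o in range(3)]
```

DMUs 1 and 3 produce one unit of output per unit of input and score 1.0; DMU 2 needs twice the input per unit of output and scores 0.5.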
Abstract:
Conventional DEA models assume deterministic, precise and non-negative data for input and output observations. However, real applications may be characterized by observations that are given in the form of intervals and include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may lie within a certain range, depending on, e.g., external temperature and real-time outtake. Complementing earlier work that addressed the two problems (interval data and negative data) separately, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of DMUs in DEA. In our general formulation, the intervals may have upper or lower bounds with different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes: strictly efficient, weakly efficient and inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches. © 2013.
Abstract:
Genetic experiments over the last few decades have identified many regulatory proteins critical for DNA transcription. The dynamics of their transcriptional activities shape the differential expression of the genes they control. Here we describe a simple method, based on secreted luciferase, to measure the activities of two transcription factors, NF-κB and HIF. This technique can effectively monitor the dynamics of transcriptional events in a population of cells and can be scaled up for high-throughput screening and promoter analysis, making it ideal for data-demanding applications such as mathematical modelling.
Abstract:
Microfluidics has recently emerged as a new method of manufacturing liposomes, allowing reproducible mixing in milliseconds on the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters in a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiments and multivariate data analysis were used for increased process understanding and for the development of predictive and correlative models. A high FRR led to the bottom-up synthesis of liposomes, and the resulting liposome size correlated strongly with the FRR in the microfluidics process, demonstrating the ability to control liposome size in-process, with liposomes of 50 nm being reproducibly manufactured. Furthermore, we demonstrate the potential of high-throughput manufacturing of liposomes using microfluidics, with a four-fold increase in the volumetric flow rate while maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and was modelled using predictive modelling. Mathematical modelling identified the FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity and transfection efficiency. This study demonstrates microfluidics as a robust and high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows for the generation of a design space for controlled particle characteristics.
Abstract:
Projects exposed to an uncertain environment must be adapted to deal with the effective integration of various planning elements and the optimization of project parameters. Time, cost, and quality are the prime objectives of a project that need to be optimized to fulfill the owner's goal. In an uncertain environment, there exist many other conflicting objectives that may also need to be optimized, characterized by varying degrees of conflict. Moreover, an uncertain environment causes several changes in the project plan throughout its life, demanding that the project plan be fully flexible. Goal programming (GP), a multiple criteria decision making technique, offers a good solution for this project planning problem. Here, the planning problem is considered from the owner's perspective, which leads to classifying the project down to the activity level. GP is applied separately at each level, and the formulated models are integrated through information flow. The flexibility and adaptability of the models lie in the ease of updating the model parameters at the required level, by changing priorities and/or constraints and transmitting the information to the other levels. The hierarchical model automatically provides integration among the various elements of planning. The proposed methodology is applied in this paper to plan a petroleum pipeline construction project, and its effectiveness is demonstrated.
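The core GP device, goal constraints with under- and over-achievement deviation variables penalized by priority weights, can be sketched in a few lines. The two activities, goal targets and weights below are illustrative assumptions, not the pipeline project's actual model:

```python
from scipy.optimize import linprog

# decision variables: [x1, x2, d1m, d1p, d2m, d2p]
#   x1, x2      : durations assigned to two activities (weeks)
#   d{k}m/d{k}p : under-/over-achievement deviations for goal k
w_cost, w_time = 2.0, 1.0            # priority weights: cost goal dominates
c = [0, 0, 0, w_cost, 0, w_time]     # penalise only overruns

# goal constraints (equalities softened by deviation variables):
#   cost: 3*x1 + 2*x2 + d1m - d1p = 5   (budget target)
#   time:   x1 +   x2 + d2m - d2p = 2   (schedule target)
A_eq = [[3, 2, 1, -1, 0, 0],
        [1, 1, 0, 0, 1, -1]]
b_eq = [5, 2]

# hard scope constraints: each activity needs a minimum duration
bounds = [(2, None), (1, None)] + [(0, None)] * 4
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
x1, x2, _, cost_over, _, time_over = res.x
```

With the minimum scope (x1 = 2, x2 = 1) the budget goal is overrun by 3 and the schedule goal by 1; the weighted objective trades these off, and updating priorities or targets only means editing `c` or `b_eq`, which mirrors the flexibility the abstract describes.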
Abstract:
This article presents the principal results of the doctoral thesis “Isomerism as internal symmetry of molecules” by Valentin Vankov Iliev (Institute of Mathematics and Informatics), successfully defended before the Specialised Academic Council for Informatics and Mathematical Modelling on 15 December, 2008.
Abstract:
This article presents the principal results of the doctoral thesis “Semantic-oriented Architecture and Models for Personalized and Adaptive Access to the Knowledge in Multimedia Digital Library” by Desislava Ivanova Paneva-Marinova (Institute of Mathematics and Informatics), successfully defended before the Specialised Academic Council for Informatics and Mathematical Modelling on 27 October, 2008.
Abstract:
This paper describes a method of signal preprocessing under active monitoring. Suppose we want to solve the inverse problem of obtaining the response of a medium to one powerful signal, which is equivalent to obtaining the transmission function of the medium, but do not have the opportunity to conduct such an experiment (it might be too expensive or harmful to the environment). In this case we can conduct a series of experiments of relatively low power and superpose the response signals. However, this method entails a considerable loss of information (especially in the high-frequency domain) due to fluctuations of the phase, the frequency and the starting time of each individual experiment. The preprocessing technique presented in this paper allows us to substantially restore the response of the medium and consequently to find a better estimate of the transmission function. This technique is based on expanding the initial signal in a system of orthogonal functions.
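The building block named at the end of the abstract, expansion of a signal in a system of orthogonal functions, can be illustrated with a discrete Fourier basis; the snippet also shows the high-frequency loss that naive superposition of time-jittered responses incurs. The signal and jitter model are synthetic assumptions; the paper's actual correction scheme is not reproduced:

```python
import numpy as np

N = 256
t = np.arange(N) / N
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# orthogonality of the discrete Fourier basis: each coefficient is an inner
# product with one basis function, and the signal is restored exactly
coeffs = np.fft.rfft(signal)
recon = np.fft.irfft(coeffs, n=N)
recon_err = np.max(np.abs(signal - recon))

# naive superposition of 50 low-power responses with random start-time jitter:
# the average attenuates the 12-cycle component far more than the 5-cycle one,
# i.e. the high-frequency information loss the preprocessing aims to avoid
rng = np.random.default_rng(0)
shifts = rng.integers(-8, 9, size=50)
stacked = np.mean([np.roll(signal, int(k)) for k in shifts], axis=0)
atten = np.abs(np.fft.rfft(stacked)[12]) / np.abs(np.fft.rfft(signal)[12])
```

Each time shift multiplies spectral bin k by a unit phasor, so the average's magnitude at bin 12 drops below the original's, while exact expansion and reconstruction in the orthogonal basis is lossless.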
Abstract:
This article presents the principal results of the doctoral thesis “Direct Operational Methods in the Environment of a Computer Algebra System” by Margarita Spiridonova (Institute of Mathematics and Informatics, BAS), successfully defended before the Specialised Academic Council for Informatics and Mathematical Modelling on 23 March, 2009.
Abstract:
This article presents the principal results of the doctoral thesis “Recognition of neume notation in historical documents” by Lasko Laskov (Institute of Mathematics and Informatics at Bulgarian Academy of Sciences), successfully defended before the Specialized Academic Council for Informatics and Mathematical Modelling on 07 June 2010.
Abstract:
AMS subject classification: 90B60, 90B50, 90A80.
Abstract:
This work is an initial study of a numerical method for identifying multiple leak zones in saturated unsteady flow. Using the conventional saturated groundwater flow equation, the leak identification problem is modelled as a Cauchy problem for the heat equation, and the aim is to find the regions on the boundary of the solution domain where the solution vanishes, since leak zones correspond to null pressure values. This problem is ill-posed, and to reconstruct the solution in a stable way we therefore modify and employ an iterative regularizing method proposed in [1] and [2]. In this method, mixed well-posed problems obtained by changing the boundary conditions are solved for the heat operator as well as for its adjoint, to get a sequence of approximations to the original Cauchy problem. The mixed problems are solved using a finite element method (FEM), and the numerical results indicate that the leak zones can be identified with the proposed method.
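Once the Cauchy problem has been solved, the final step described above is a read-off: leak zones are the boundary regions where the reconstructed pressure vanishes. A minimal post-processing sketch, using a synthetic (hypothetical) boundary pressure profile in place of the FEM reconstruction:

```python
import numpy as np

s = np.linspace(0.0, 1.0, 301)                    # boundary coordinate
p = np.clip(np.cos(3 * np.pi * s), 0.0, None)     # stand-in reconstructed pressure
mask = np.abs(p) < 1e-8                           # null-pressure points

# extract contiguous leak intervals from the boolean mask
edges = np.flatnonzero(np.diff(np.r_[0, mask.astype(int), 0]))
zones = [(s[a], s[b - 1]) for a, b in zip(edges[::2], edges[1::2])]
```

For this profile the pressure vanishes on [1/6, 1/2] and [5/6, 1], so `zones` recovers exactly those two intervals; in the paper's setting `p` would come from the regularized Cauchy solution rather than a closed-form stand-in.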
Abstract:
The first part of the paper briefly reviews the course of the financial crisis that began in 2007 and the causes that led to it, focusing throughout on the underlying processes and their driving forces, thereby capturing a kind of "theory" of the crisis. This review shows clearly the central role of credit derivatives in the crisis. The second part presents the mathematical modelling of one of the most popular credit derivative products, synthetic collateralized debt obligations (CDOs), along with the drawbacks and problems of the modelling process. It is widely claimed that these mathematical models caused, or at least amplified, the crisis. The authors show not only that the modelling tools were inappropriate, but also that the pricing principle did not hold within the risk-neutral valuation framework. This result points sharply to a crisis of the underlying theories.
Abstract:
Acknowledgment: The first two authors wish to express their sincere thanks to the Iran National Science Foundation (INSF) for supporting this work under contract number 92021291.