660 results for MATHEMATICAL PROGRAMS


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new numerical program able to model syntectonic sedimentation. The new model combines a discrete element model of the tectonic deformation of a sedimentary cover and a process-based model of sedimentation in a single framework. The integration of these two methods allows us to simulate both sedimentation and deformation processes in a single, more effective model. The paper briefly describes the antecedents of the program (Simsafadim-Clastic and a discrete element model) in order to introduce the methodology used to merge both programs into the new code. To illustrate the operation and application of the program, we analyse the evolution of syntectonic geometries both in an extensional environment and in association with thrust fault propagation. Using the new code, much more complex and realistic depositional structures can be simulated, together with a more complex analysis of the evolution of the deformation within the sedimentary cover, which is affected by the presence of the new syntectonic sediments.
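
The coupling of the two models can be pictured as an alternating time loop. The sketch below is illustrative only: the step functions `deform` and `deposit` are hypothetical stand-ins, not the actual Simsafadim-Clastic or discrete element code.

```python
# Illustrative sketch: a coupled simulation alternates a tectonic
# deformation step and a sedimentation step at each time increment.

def deform(cover, dt):
    """Stand-in tectonic step: uniform subsidence of the cover surface."""
    return [h - 0.001 * dt for h in cover]

def deposit(cover, dt):
    """Stand-in sedimentation step: fill accommodation toward base level 0,
    limited by a sediment supply rate."""
    return [h + min(0.0005 * dt, max(0.0, -h)) for h in cover]

def run(cover, dt, steps):
    for _ in range(steps):
        cover = deform(cover, dt)   # deformation acts on the new sediments
        cover = deposit(cover, dt)  # sedimentation sees the deformed surface
    return cover

surface = run([0.0, -0.5, -1.0], dt=1.0, steps=100)
```

The essential design point is that each step sees the output of the other within the same time increment, which is what allows the deformation to be affected by the new syntectonic sediments.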

Relevance:

20.00%

Publisher:

Abstract:

The aim of this project is to get used to another kind of programming. Until now, I have used very complex programming languages to develop applications and even to program microcontrollers, but the PicoCricket system is evidence that we do not need such complex development tools to build functional devices. The PicoCricket system is a clear example of simple programming that makes devices work the way we program them. It offers an easy yet effective way to program small devices by simply stating what we want them to do. We cannot implement complex algorithms or mathematical operations, but we can program these devices in a short time. Nowadays, the easier and faster we produce, the more we earn; the tendency is therefore to develop quickly, cheaply and easily, and the PicoCricket system makes that possible.

Relevance:

20.00%

Publisher:

Abstract:

The present thesis is focused on minimizing the experimental effort needed to predict pollutant propagation in rivers, by means of mathematical modelling and knowledge reuse. The mathematical modelling is based on the well-known advection-dispersion equation, while the knowledge-reuse approach employs the methods of case-based reasoning, graphical analysis and text mining. The thesis contributes to the pollutant transport research field: (1) analytical and numerical models for pollutant transport prediction; (2) two novel techniques which enable the use of variable parameters along rivers in analytical models; (3) models for the estimation of the characteristic transport parameters (velocity, dispersion coefficient and nutrient transformation rates) as functions of water flow, channel characteristics and/or seasonality; (4) a graphical analysis method for identifying pollution sources along rivers; (5) a case-based reasoning tool for identifying crucial information related to pollutant transport modelling; and (6) the application of a software tool for the reuse of information during pollutant transport modelling research. These support tools are applicable in the water quality research field and in practice as well, as they can be involved in multiple activities. The models are capable of predicting pollutant propagation along rivers in cases of both ordinary pollution and accidents. They can also be applied to other, similar rivers when modelling pollutant transport with little experimental concentration data available, because the parameter estimation models developed in this thesis enable the calculation of the characteristic transport parameters as functions of river hydraulic parameters and/or seasonality.
The similarity between rivers is assessed using case-based reasoning tools, and additional necessary information can be identified using the software for information reuse. Such systems support users and open up possibilities for new modelling methods, monitoring facilities and better river water quality management tools. They are also useful for estimating the environmental impact of possible technological changes and can be applied at the pre-design stage and/or in the practical operation of processes.
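
As background on the advection-dispersion equation mentioned above, the sketch below evaluates the classical one-dimensional solution for an instantaneous point source; all parameter values are invented for illustration and are not the thesis's data.

```python
import math

def concentration(x, t, mass, area, velocity, dispersion):
    """Classical 1-D advection-dispersion solution for an instantaneous
    point source released at x = 0, t = 0:
    C(x, t) = M / (A * sqrt(4*pi*D*t)) * exp(-(x - v*t)**2 / (4*D*t))."""
    spread = 4.0 * dispersion * t
    return (mass / (area * math.sqrt(math.pi * spread))
            ) * math.exp(-((x - velocity * t) ** 2) / spread)

# The concentration peak travels at the advection velocity v, so at
# t = 1000 s it sits near x = v*t = 500 m (illustrative values).
c_peak = concentration(x=500.0, t=1000.0, mass=10.0, area=20.0,
                       velocity=0.5, dispersion=5.0)
c_off = concentration(x=300.0, t=1000.0, mass=10.0, area=20.0,
                      velocity=0.5, dispersion=5.0)
```

Allowing the velocity and dispersion coefficient to vary along the river, as the thesis's techniques do, goes beyond this constant-parameter closed form.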

Relevance:

20.00%

Publisher:

Abstract:

Customer loyalty programs have proliferated in recent decades, and customers are typically members of several loyalty programs at once. Loyalty programs are seen as a tool of customer relationship management, aimed at long-term, profitable customer relationships. Companies have created a market in which customers are rewarded automatically, even if they are disloyal. In addition, companies struggle to motivate customers to purchase more and to visit more frequently.

Relevance:

20.00%

Publisher:

Abstract:

In this work, a new mathematical-equation correction approach for overcoming spectral and transport interferences is proposed. The approach was applied to eliminate the spectral interference caused by PO molecules at the 217.0005 nm Pb line, and the transport interference caused by variations in phosphoric acid concentration. Correction may be necessary at 217.0005 nm to account for the contribution of PO, since A_total(217.0005 nm) = A_Pb(217.0005 nm) + A_PO(217.0005 nm). This may easily be done by measuring another PO wavelength (e.g. 217.0458 nm) and calculating the relative contribution of the PO absorbance (A_PO) to the total absorbance (A_total) at 217.0005 nm: A_Pb(217.0005 nm) = A_total(217.0005 nm) - A_PO(217.0005 nm) = A_total(217.0005 nm) - k·A_PO(217.0458 nm). The correction factor k is calculated from the slopes of calibration curves built for phosphorus (P) standard solutions measured at 217.0005 and 217.0458 nm, i.e. k = slope(217.0005 nm)/slope(217.0458 nm). For a wavelength-integrated absorbance of 3 pixels and a sample aspiration rate of 5.0 mL min-1, analytical curves in the 0.1 - 1.0 mg L-1 Pb range with linearity better than 0.9990 were consistently obtained. Calibration curves for P at 217.0005 and 217.0458 nm with linearity better than 0.998 were obtained. Relative standard deviations (RSD) of measurements (n = 12) were in the ranges 1.4 - 4.3% and 2.0 - 6.0% without and with the mathematical correction, respectively. The limit of detection at the 217.0005 nm analytical line was 10 µg L-1 Pb. Recoveries for Pb spikes were in the 97.5 - 100% and 105 - 230% intervals with and without the mathematical correction, respectively.
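
The correction above is a one-line computation once k is known; the sketch below simply restates the abstract's equations in code. The absorbance and slope values are invented for illustration.

```python
def corrected_pb_absorbance(a_total_pb_line, a_po_ref_line,
                            slope_p_pb_line, slope_p_ref_line):
    """Subtract the PO contribution at the Pb line (217.0005 nm) using a
    reference PO line (217.0458 nm):
    A_Pb = A_total - k * A_PO(ref), with k = slope(Pb line) / slope(ref line).
    All numeric values used below are invented for illustration."""
    k = slope_p_pb_line / slope_p_ref_line
    return a_total_pb_line - k * a_po_ref_line

# k = 0.040 / 0.020 = 2.0, so A_Pb = 0.150 - 2.0 * 0.030 = 0.090
a_pb = corrected_pb_absorbance(0.150, 0.030, 0.040, 0.020)
```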

Relevance:

20.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not been evaluated in large scale studies yet. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. 
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses.
Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first and second year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
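
Outside Socos's diagram notation, the underlying idea of keeping code consistent with its invariants can be illustrated with a conventional loop whose invariant is checked on every iteration. This is our own small example, not from the thesis.

```python
def triangular(n):
    """Sum of 0..n, written so the loop invariant is explicit and checked
    on every iteration (a conventional example, not Socos's notation)."""
    i, s = 0, 0
    while i <= n:
        # invariant: s is the sum of 0..i-1, i.e. s == i*(i-1)//2
        assert s == i * (i - 1) // 2
        s += i
        i += 1
    # the invariant plus the exit condition (i == n+1) give the postcondition
    assert s == n * (n + 1) // 2
    return s
```

In invariant-based programming the invariant is established first and each code addition is proved consistent with it; the runtime assertions here merely check the same conditions dynamically.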

Relevance:

20.00%

Publisher:

Abstract:

Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject for their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education. In Finland the high school curriculum does not include CS as a subject; instead, focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues: Structured derivations is a logic-based approach to teaching mathematics, where formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation at the same time as they become more confident with formalisms. The Python programming language was originally designed with education in mind, and has a simple syntax compared to many other popular languages. 
The aim of using it in instruction is to address algorithms and their implementation in a way that allows the focus to be put on learning algorithmic thinking and programming instead of on learning a complex syntax. Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
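
As a small illustration of the kind of syntactic economy the abstract attributes to Python (the example is ours, not the thesis's), Euclid's algorithm fits in a few lines, keeping attention on the algorithm rather than the notation:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) by (b, a mod b)
    until b is 0; the remaining a is the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a
```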

Relevance:

20.00%

Publisher:

Abstract:

Preference relations, and their modeling, have played a crucial role in both the social sciences and applied mathematics. A special category of preference relations is that of cardinal preference relations, which are simply relations that also account for the degree of preference. Preference relations play a pivotal role in most multi-criteria decision-making methods and in operations research. This thesis presents some recent advances in their methodology. There are a number of open issues in this field, and the contributions presented in this thesis can be grouped accordingly. The first issue regards the estimation of a weight vector given a preference relation. A new and efficient algorithm for estimating the priority vector of a reciprocal relation, i.e. a special type of preference relation, is presented. The same section contains the proof that twenty methods already proposed in the literature lead to unsatisfactory results, as they employ a conflicting constraint in their optimization model. The second area of interest concerns consistency evaluation and is possibly the kernel of the thesis. The thesis contains proofs that some indices are equivalent and that, therefore, some seemingly different formulae end up leading to the very same result. Moreover, some numerical simulations are presented. The section ends with some considerations on a new method for fairly evaluating consistency. The third matter regards incomplete relations and how to estimate missing comparisons. This section reports a numerical study of the methods already proposed in the literature and analyzes their behavior in different situations. The fourth and last topic proposes a way to deal with group decision making by connecting preference relations with social network analysis.
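
The abstract does not reproduce the thesis's new estimation algorithm. As background only, one classical way of extracting a priority vector from a multiplicative pairwise comparison matrix is the geometric-mean method:

```python
import math

def priority_vector(matrix):
    """Geometric-mean priority vector for a multiplicative pairwise
    comparison matrix: a classical estimation method, shown as background
    only (the thesis proposes its own, different algorithm)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# A perfectly consistent 3x3 comparison matrix (invented values):
# entry a_ij says how many times alternative i is preferred to j.
A = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
w = priority_vector(A)
```

For a perfectly consistent matrix every reasonable estimation method recovers the same weights; the methods differ, and some fail, precisely when the matrix is inconsistent or incomplete.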

Relevance:

20.00%

Publisher:

Abstract:

The feasibility of using augmented block designs and spatial analysis methods for early-stage selection in eucalyptus breeding programs was tested. A total of 113 half-sib progenies of Eucalyptus urophylla and eight clones were evaluated in an 11 x 11 triple lattice experiment at two locations: Posto da Mata (Bahia, Brazil) and São Mateus (Minas Gerais, Brazil). Four checks were randomly allocated within each block. Plots consisted of 15 m long rows containing 6 plants spaced 3 m apart. Girth at breast height (cm/plant) was evaluated at 19 and 26 months of age. Analyses of variance were performed according to the following methods: lattice design, randomized complete block design, augmented block design, the Papadakis method, the moving-means method, and check plots. Comparisons among the methods were based on the magnitude of the experimental errors and on the precision of the estimates of genetic and phenotypic parameters. The general results indicate that the augmented block design is useful for evaluating progenies and clones in early selection in eucalyptus breeding programs using moderate and low selection intensities. However, this design is not suitable for estimating genetic and phenotypic parameters because of its low precision. The check plots, nearest-neighbour, Papadakis (1937) and moving-means methods were efficient in removing heterogeneity within blocks; their efficiency was compared with that of the lattice analysis for the estimation of genetic and phenotypic parameters.
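
The Papadakis method mentioned above adjusts each plot using a covariate built from the residuals of its neighbours. A minimal one-dimensional sketch with invented data follows; for brevity, raw treatment means stand in for the fitted values a full analysis of covariance would use.

```python
def papadakis_covariate(values, treatments):
    """For each plot in a single row of plots, return the mean residual of
    its immediate neighbours, where a residual is the plot value minus its
    treatment mean. This covariate is then included in an analysis of
    covariance to absorb spatial trend. Data below are invented."""
    means = {t: sum(v for v, u in zip(values, treatments) if u == t)
                / treatments.count(t)
             for t in set(treatments)}
    resid = [v - means[t] for v, t in zip(values, treatments)]
    cov = []
    for i in range(len(values)):
        nb = [resid[j] for j in (i - 1, i + 1) if 0 <= j < len(values)]
        cov.append(sum(nb) / len(nb))
    return cov

cov = papadakis_covariate([10.0, 12.0, 11.0, 13.0], ["A", "B", "A", "B"])
```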

Relevance:

20.00%

Publisher:

Abstract:

The present study aimed to determine the volumetric shrinkage rate of bean (Phaseolus vulgaris L.) seeds during air-drying under different conditions of air temperature and relative humidity, to fit several mathematical models to the observed empirical values, and to select the one that best represents the phenomenon. Six mathematical models were fitted to the experimental values. The goodness of fit of each model was determined from the coefficient of determination, the behavior of the distribution of the residuals, and the magnitude of the mean relative and estimated errors. The volumetric shrinkage of bean seeds during drying is between 25 and 37%; it depends essentially on the final moisture content, regardless of the air conditions during drying. The modified Bala & Woods model best represented the process.
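
The two fit criteria named above, the coefficient of determination and the mean relative error, can be computed directly; the observed and predicted shrinkage values below are invented for illustration.

```python
def r_squared(observed, predicted):
    """Coefficient of determination of a fitted model."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def mean_relative_error(observed, predicted):
    """Mean of |observed - predicted| / |observed|, as a percentage."""
    return 100.0 * sum(abs(o - p) / abs(o)
                       for o, p in zip(observed, predicted)) / len(observed)

# Invented shrinkage-ratio data for one hypothetical model.
obs = [1.00, 0.90, 0.80, 0.72]
pred = [0.99, 0.91, 0.79, 0.73]
```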

Relevance:

20.00%

Publisher:

Abstract:

In São Paulo State, mainly in rural areas, wooden poles are used for different purposes. In this context, wood in contact with the ground deteriorates faster, which is generally associated with environmental factors and, especially, with the presence of fungi and insects. With the use of mathematical models, the useful life of wooden structures can be predicted by obtaining "climatic indexes" that indicate, comparatively among the areas studied, which are more or less prone to fungus and insect attack. In this work, using climatological data from several cities in São Paulo State, a simplified mathematical model was obtained to measure the aggressiveness of the environment to wood in contact with the soil.

Relevance:

20.00%

Publisher:

Abstract:

Rural electrification is characterized by the geographical dispersion of the population, low consumption, high investment per consumer and high cost. Solar radiation, moreover, constitutes an inexhaustible source of energy, and photovoltaic panels are used to convert it into electricity. In this study, the manufacturer's equations for the current and power of small photovoltaic systems were adjusted to field conditions. The mathematical analysis was performed on the I-100 rural photovoltaic system from ISOFOTON, with a power of 300 Wp, located at the Lageado Experimental Farm of FCA/UNESP. To develop these equations, the equivalent circuit of the photovoltaic cells was studied, and iterative numerical methods were applied to determine the electrical parameters and to assess possible errors when fitting the equations from the literature to reality. A simulation of a photovoltaic panel was therefore proposed through mathematical equations adjusted to the local radiation data. The resulting equations provide realistic answers to the user and may assist in the design of these systems, since the calculated maximum power limit ensures the supply of the generated energy. This realistic sizing helps establish the possible applications of solar energy for the rural producer and informs the real possibilities of generating electricity from the sun.
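
The iterative solution of a cell's equivalent-circuit equation can be sketched with the standard single-diode model; the parameter values below are illustrative, not those of the ISOFOTON I-100.

```python
import math

def diode_current(v, i_ph, i_0, r_s, r_sh, n_vt, iters=50):
    """Standard single-diode photovoltaic model,
    I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh,
    solved by fixed-point iteration because I appears on both sides.
    Parameter values used below are illustrative only."""
    i = i_ph  # start from the photo-generated current
    for _ in range(iters):
        i = (i_ph
             - i_0 * (math.exp((v + i * r_s) / n_vt) - 1.0)
             - (v + i * r_s) / r_sh)
    return i

# Near short circuit (V = 0) the output current is close to Iph.
i_sc = diode_current(v=0.0, i_ph=5.0, i_0=1e-9, r_s=0.01, r_sh=100.0, n_vt=0.7)
```

Sweeping `v` from 0 to the open-circuit voltage traces the I-V curve, from which the maximum power point can be located.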

Relevance:

20.00%

Publisher:

Abstract:

This study aimed to apply mathematical models to the growth of Nile tilapia (Oreochromis niloticus) reared in net cages in the lower São Francisco basin and to choose the model(s) that best represent the rearing conditions of the region. The nonlinear models of Brody, Bertalanffy, Logistic, Gompertz and Richards were tested. The models were fitted to the weight-for-age series according to the methods of Gauss, Newton, Gradient and Marquardt, using the NLIN procedure of the SAS® system (2003) to obtain parameter estimates from the available data. The best fits were obtained with the Bertalanffy, Gompertz and Logistic models, which are equivalent in explaining the growth of the animals up to 270 days of rearing. From the commercial point of view, it is recommended that tilapia be marketed at no less than 600 g, a weight that the Bertalanffy, Gompertz and Logistic models estimate to be reached after 183, 181 and 184 days of rearing, respectively; for fish of up to 1 kg, ending the rearing at 244, 244 and 243 days, respectively, is suggested.
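
The three equivalent growth curves have standard closed forms, and the logistic curve can be inverted to give the day on which a target weight is reached. The parameter values below are illustrative only, not the paper's estimates.

```python
import math

def bertalanffy(t, a, b, k):
    """von Bertalanffy weight curve: W(t) = A * (1 - b*exp(-k*t))**3."""
    return a * (1.0 - b * math.exp(-k * t)) ** 3

def gompertz(t, a, b, k):
    """Gompertz curve: W(t) = A * exp(-b*exp(-k*t))."""
    return a * math.exp(-b * math.exp(-k * t))

def logistic(t, a, b, k):
    """Logistic curve: W(t) = A / (1 + b*exp(-k*t))."""
    return a / (1.0 + b * math.exp(-k * t))

def logistic_age_at_weight(w, a, b, k):
    """Invert the logistic curve: the day on which weight w is reached."""
    return math.log(b * w / (a - w)) / k

# Illustrative parameters only (asymptotic weight 1400 g, b = 30,
# k = 0.025 per day); not the paper's fitted values.
day_600g = logistic_age_at_weight(600.0, 1400.0, 30.0, 0.025)
```

Fitting these curves to observed weight-for-age data is a nonlinear least-squares problem, which is what the NLIN procedure solves with the Gauss, Newton, Gradient and Marquardt methods.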