980 results for 280406 Mathematical Software
Abstract:
Laboratory practice is a very important part of training in all educational programs. Despite this importance, setting up a laboratory is not an easy task, since equipping one can involve considerable expense, both initially and over time. Distance education, and in particular virtual laboratories (simulations of a real laboratory built on mathematical models), has emerged as a solution. Because of their characteristics and flexibility, virtual laboratories have been developed in many teaching areas, but not all areas offer as many possibilities or facilities as electronics. Most of the laboratories currently accessible over the Internet within distance or online learning are virtual. The main advantage of the laboratory developed here is that practical exercises are carried out by remotely controlling real instruments and circuits. The project consists of building a software system that implements a remote laboratory in the area of analog electronics, to be used as a complement to the training activities carried out in the laboratories of educational institutions. The complete system also includes hardware controlled through standard communication buses, which allows different analog circuits to be implemented so that exercises can be performed on real physical circuits. To make the laboratory as realistic as possible, the application handled by the student is a 3D viewer; its purpose is to increase the sense of reality when carrying out laboratory exercises remotely. The developed system uses a communication scheme based on a client-server model: • Server: processes the actions performed by the client and controls and monitors the instruments and devices of the hardware system. • Client: the end user, who through the 3D viewer sends the actions to be performed to the server, which then processes them.
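Implementation details are not given in the abstract, but the client-server exchange it describes could be sketched roughly as follows. The transport, port, message format and command names (for example `SET_FREQ`) are hypothetical illustrations, not the project's actual protocol; in the real system the 3D viewer acts as the client and the server drives the instruments over standard communication buses.

```python
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 5050  # hypothetical server address

def server():
    """Receives one JSON command from the 3D-viewer client and 'drives' the hardware."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            cmd = json.loads(conn.recv(1024).decode())
            # Here the real system would talk to the instruments over a standard bus.
            print(f"server: applying {cmd}")
            conn.sendall(json.dumps({"status": "ok", "echo": cmd}).encode())

def client():
    """Stands in for the 3D viewer: sends one action and waits for the result."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(json.dumps({"action": "SET_FREQ", "hz": 1000}).encode())
        print("client:", json.loads(cli.recv(1024).decode()))

if __name__ == "__main__":
    t = threading.Thread(target=server, daemon=True)
    t.start()
    client()
    t.join()
```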
Abstract:
The analysis of the interdependence between time series has become an important field of research, mainly as a result of advances in the characterization of dynamical systems from the signals they produce and the introduction of concepts such as Generalized Synchronization (GS) and Phase Synchronization (PS). This growing number of approaches for assessing the so-called functional connectivity (FC) and effective connectivity (EC) (Friston 1994) between two (or among many) neural networks, along with their mathematical complexity, makes it desirable to arrange them into a unified toolbox, thereby allowing neuroscientists, neurophysiologists and researchers from related fields to easily access and make use of them.
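As one concrete example of a phase-synchronization measure of the kind such a toolbox would collect, the phase-locking value (PLV) between two signals can be computed from their instantaneous phases obtained via the Hilbert transform. This is a standard index from the literature, not necessarily the toolbox's own implementation.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value (PLV) between two 1-D signals.

    PLV = |mean(exp(i * (phi_x - phi_y)))|, where the instantaneous phases
    phi_x, phi_y come from the analytic signal (Hilbert transform).
    Returns a value in [0, 1]; 1 means perfect phase synchronization.
    """
    phi_x = np.angle(hilbert(x))
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Example: two noisy sinusoids sharing the same frequency are strongly phase locked.
t = np.linspace(0, 10, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)
y = np.sin(2 * np.pi * 5 * t + 0.8) + 0.1 * np.random.randn(t.size)
print(f"PLV = {phase_locking_value(x, y):.3f}")
```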
Abstract:
Costs and environmental impacts are key elements in forest logistics and must be integrated into forest decision-making. The evaluation of transportation fuel costs and carbon emissions depends on spatial and non-spatial data, but in many cases the former type of data is difficult to obtain. On the other hand, the availability of software tools to evaluate transportation fuel consumption, as well as costs and emissions of carbon dioxide, is limited. We developed a software tool that combines two empirically validated models of truck transportation using Digital Elevation Model (DEM) data and an open spatial data tool, specifically OpenStreetMap©. The tool generates tabular data and spatial outputs (maps) with information on fuel consumption, cost and CO2 emissions for four types of trucks. It also generates maps of the distribution of transport performance indicators (the relation between beeline and real road distances). These outputs can be easily included in forest decision-making support systems. Finally, in this work we applied the tool to a particular case of forest logistics in north-eastern Portugal.
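The abstract does not reproduce the empirically validated truck models, so the following sketch only illustrates the kind of per-segment calculation such a tool performs: fuel from road length and DEM-derived grade, cost and CO2 from the fuel, and the beeline-to-road-distance performance indicator. All coefficients and truck classes are invented placeholders, not the tool's actual parameters.

```python
from dataclasses import dataclass

# Placeholder coefficients for illustration only.
FUEL_L_PER_KM_FLAT = {"light": 0.25, "medium": 0.35, "heavy": 0.45, "extra": 0.55}
SLOPE_PENALTY_L_PER_KM_PER_PCT = 0.02   # extra litres per km per % uphill grade
CO2_KG_PER_LITRE = 2.64                 # typical diesel emission factor
FUEL_COST_EUR_PER_LITRE = 1.50

@dataclass
class Segment:
    length_km: float
    rise_m: float      # elevation gain from the DEM (negative = downhill)

def segment_fuel(seg: Segment, truck: str) -> float:
    grade_pct = max(0.0, 100.0 * seg.rise_m / (seg.length_km * 1000.0))
    per_km = FUEL_L_PER_KM_FLAT[truck] + SLOPE_PENALTY_L_PER_KM_PER_PCT * grade_pct
    return per_km * seg.length_km

def route_summary(segments, truck="heavy", beeline_km=None):
    road_km = sum(s.length_km for s in segments)
    litres = sum(segment_fuel(s, truck) for s in segments)
    out = {"fuel_l": litres,
           "cost_eur": litres * FUEL_COST_EUR_PER_LITRE,
           "co2_kg": litres * CO2_KG_PER_LITRE}
    if beeline_km:                       # transport performance indicator
        out["beeline_to_road_ratio"] = beeline_km / road_km
    return out

print(route_summary([Segment(12.0, 180.0), Segment(8.0, -40.0)],
                    truck="heavy", beeline_km=15.0))
```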
Abstract:
Our extensive research has indicated that high-school teachers are reluctant to make use of existing instructional educational software (Pollard, 2005). Even software developed in a partnership between a teacher and a software engineer is unlikely to be adopted by teachers outside the partnership (Pollard, 2005). In this paper we address these issues directly by adopting a reusable architectural design for instructional educational software which allows easy customisation of the software to meet the specific needs of individual teachers. By doing this we will help more teachers to use instructional technology regularly within their classrooms. Our domain-specific software architecture, Interface-Activities-Model, was designed specifically to facilitate individual customisation by redefining and restructuring what constitutes an object, so that objects can be readily reused or extended as required. The key to this architecture is the way in which the software is broken into small generic encapsulated components with minimal domain-specific behaviour. The domain-specific behaviour is decoupled from the interface and encapsulated in objects which relate to the instructional material through tasks and activities. The domain model is also broken into two distinct models: the Application State Model and the Domain-specific Data Model. This decoupling and distribution of control gives the software designer enormous flexibility in modifying components without affecting other sections of the design. This paper sets the context of this architecture, describes it in detail, and applies it to an actual application developed to teach high-school mathematical concepts.
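The abstract names the Interface-Activities-Model decomposition but does not give its class structure, so the following is only a minimal, assumed sketch of that style of decoupling: a generic interface component with no domain knowledge, an activity object carrying the domain-specific behaviour, and the model split into an Application State Model and a Domain-specific Data Model. All class names are illustrative.

```python
class DomainDataModel:
    """Domain-specific data: here, a single arithmetic exercise."""
    def __init__(self, question: str, answer: float):
        self.question, self.answer = question, answer

class ApplicationStateModel:
    """Application state kept separate from the domain data (score, progress)."""
    def __init__(self):
        self.score = 0

class Activity:
    """Encapsulates domain-specific behaviour, decoupled from the interface."""
    def __init__(self, data: DomainDataModel, state: ApplicationStateModel):
        self.data, self.state = data, state

    def check(self, response: float) -> bool:
        correct = abs(response - self.data.answer) < 1e-9
        self.state.score += int(correct)
        return correct

class GenericPromptWidget:
    """Generic interface component with no domain knowledge of its own."""
    def __init__(self, activity: Activity):
        self.activity = activity

    def run(self, response: float):
        print(self.activity.data.question,
              "-> correct" if self.activity.check(response) else "-> try again")

widget = GenericPromptWidget(
    Activity(DomainDataModel("3 * 7 = ?", 21), ApplicationStateModel()))
widget.run(21)
```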
Abstract:
This work reports the development of a mathematical model and a distributed, multivariable computer-control system for a pilot-plant double-effect climbing-film evaporator. A distributed-parameter model of the plant has been developed and the time-domain model transformed into the Laplace domain. The model has been further transformed into an integral domain conforming to an algebraic ring of polynomials, to eliminate the transcendental terms which arise in the Laplace domain due to the distributed nature of the plant model. This has made possible the application of linear control theories to a set of linear partial differential equations. The models obtained track the experimental results of the plant well. A distributed-computer network has been interfaced with the plant to implement digital controllers in a hierarchical structure. A modern multivariable Wiener-Hopf controller has been applied to the plant model. The application has revealed a limiting condition: the plant matrix should be positive-definite along the infinite-frequency axis. A new multivariable control theory has emerged from this study which avoids the above limitation. The controller has the structure of the modern Wiener-Hopf controller, but with a unique feature enabling the designer to specify the closed-loop poles in advance and to shape the sensitivity matrix as required. In this way, the method treats directly the interaction problems found in chemical processes, with good tracking and regulation performance. The ability of analytical design methods to determine once and for all whether a given set of specifications can be met is one of their chief advantages over conventional trial-and-error design procedures. However, one disadvantage that offsets these advantages to some degree is the relatively complicated algebra that must be employed in working out all but the simplest problems. Mathematical algorithms and computer software have been developed to treat some of the mathematical operations defined over the integral domain, such as matrix fraction description, spectral factorization, the Bezout identity, and the general manipulation of polynomial matrices. Hence, the design problems of Wiener-Hopf-type controllers and other similar algebraic design methods can be easily solved.
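Among the polynomial operations listed, the Bezout identity is easy to illustrate for scalar polynomials. The sketch below uses the extended Euclidean algorithm over polynomial coefficients (low-to-high order, as in numpy.polynomial); it illustrates the identity a·x + b·y = gcd(a, b) and is not the thesis's polynomial-matrix software.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def poly_divmod(a, b):
    """Quotient and remainder of polynomial division (coefficients low-to-high)."""
    q, r = P.polydiv(a, b)
    return q, np.trim_zeros(r, 'b')

def bezout(a, b):
    """Extended Euclidean algorithm for polynomials: returns (g, x, y) with
    a*x + b*y = g, where g is a greatest common divisor of a and b."""
    r0, r1 = np.atleast_1d(a).astype(float), np.atleast_1d(b).astype(float)
    x0, x1 = np.array([1.0]), np.array([0.0])
    y0, y1 = np.array([0.0]), np.array([1.0])
    while r1.size and np.max(np.abs(r1)) > 1e-12:
        q, r = poly_divmod(r0, r1)
        r0, r1 = r1, (r if r.size else np.array([0.0]))
        x0, x1 = x1, P.polysub(x0, P.polymul(q, x1))
        y0, y1 = y1, P.polysub(y0, P.polymul(q, y1))
    return r0, x0, y0

# a(s) = (s+1)(s+2), b(s) = (s+1)(s+3): the gcd should be proportional to (s+1).
a = P.polymul([1, 1], [2, 1])
b = P.polymul([1, 1], [3, 1])
g, x, y = bezout(a, b)
print("gcd coefficients:", g)
print("check a*x + b*y :", P.polyadd(P.polymul(a, x), P.polymul(b, y)))
```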
Abstract:
Using current software engineering technology, the robustness required for safety-critical software cannot be assured. However, different approaches are possible which can help to assure software robustness to some extent. To achieve highly reliable software, methods should be adopted which avoid introducing faults (fault avoidance); testing should then be carried out to identify any faults which persist (error removal); finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). The verification of correctness in the system design specification and the performance analysis of the model are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysing the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language where interprocess interaction takes place by communication. This may lead to deadlock due to communication failure. Proper systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems. Where Occam is used as a design language, state-space methods such as Petri nets can be used in analysis and simulation to determine the dynamic behaviour of the software and to identify structures which may be prone to deadlock, so that they may be eliminated from the design before the program is ever run. This design software tool consists of two parts. One takes an input program and translates it into a mathematical model (a Petri net), which is used for modelling and analysis of the concurrent software. The second part is the Petri-net simulator, which takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies 'deadlock potential' which the user can explore further. Finally, the software tool has been applied to a number of Occam programs. Two examples are given to show how the tool works in the early design phase for fault prevention before the program is ever run.
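A minimal illustration of the reachability analysis described above is sketched below: a tiny, hypothetical Petri net in which two processes acquire two shared channels in opposite order (a classic deadlock-prone pattern in Occam/CSP designs), and a state-space search that flags markings with no enabled transition as 'deadlock potential'. This is the general technique only, not the thesis tool.

```python
# transitions: name -> (consume, produce) as {place: tokens}; structure is hypothetical.
TRANSITIONS = {
    "p1_get_a": ({"a": 1, "p1_idle": 1}, {"p1_has_a": 1}),
    "p1_get_b": ({"b": 1, "p1_has_a": 1}, {"p1_done": 1, "a": 1, "b": 1}),
    "p2_get_b": ({"b": 1, "p2_idle": 1}, {"p2_has_b": 1}),
    "p2_get_a": ({"a": 1, "p2_has_b": 1}, {"p2_done": 1, "a": 1, "b": 1}),
}
INITIAL = {"a": 1, "b": 1, "p1_idle": 1, "p2_idle": 1}

def enabled(marking):
    for name, (consume, _) in TRANSITIONS.items():
        if all(marking.get(p, 0) >= n for p, n in consume.items()):
            yield name

def fire(marking, name):
    consume, produce = TRANSITIONS[name]
    m = dict(marking)
    for p, n in consume.items():
        m[p] -= n
    for p, n in produce.items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n}

def reachability(initial):
    """Explore all reachable markings; collect non-final markings with no successor."""
    seen, frontier, deadlocks = set(), [initial], []
    while frontier:
        m = frontier.pop()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        succs = [fire(m, t) for t in enabled(m)]
        if not succs and not (m.get("p1_done") and m.get("p2_done")):
            deadlocks.append(m)   # no transition enabled and not a final marking
        frontier.extend(succs)
    return deadlocks

print("deadlock markings:", reachability(INITIAL))
```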
Abstract:
Many software engineers have found that it is difficult to understand, incorporate and use different formal models consistently in the process of software development, especially for large and complex software systems. This is mainly due to the complex mathematical nature of formal methods and the lack of tool support. It is highly desirable to have software models and their related software artefacts systematically connected and used collaboratively, rather than in isolation. The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models and supports more effective software design in terms of understanding, sharing and reusing in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. This paper proposes a framework that allows users to interconnect the knowledge about formal software models and other related documents using semantic technology. We first propose a methodology with tool support to automatically derive ontological metadata from formal software models and describe them semantically. We then develop a Semantic Web environment for representing and sharing formal Z/OZ models. A method with a prototype tool is presented to enhance semantic querying of software models and other artefacts.
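A rough sketch of what ontological metadata for a formal model could look like, using the rdflib library: the vocabulary, schema names and query below are hypothetical examples, not the paper's actual ontology for Z/OZ models.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Hypothetical vocabulary and model names, purely for illustration.
FM = Namespace("http://example.org/formal-models#")

g = Graph()
g.bind("fm", FM)

# Describe a Z schema and link it to a related design document.
g.add((FM.BirthdayBook, RDF.type, FM.ZSchema))
g.add((FM.BirthdayBook, RDFS.label, Literal("BirthdayBook state schema")))
g.add((FM.BirthdayBook, FM.declaresVariable, Literal("known : P NAME")))
g.add((FM.BirthdayBook, FM.describedBy, FM.RequirementsDoc1))
g.add((FM.RequirementsDoc1, RDF.type, FM.DesignDocument))

# Semantic query: which artefacts describe which schemas?
q = """
SELECT ?schema ?doc WHERE {
    ?schema a fm:ZSchema ;
            fm:describedBy ?doc .
}
"""
for row in g.query(q, initNs={"fm": FM}):
    print(f"{row.schema} is described by {row.doc}")

print(g.serialize(format="turtle"))
```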
Abstract:
This article considers a set of questions connected with the analysis, estimation and structural-parametric optimization of dynamic systems. The connection of such problems with the control of beams of trajectories is emphasized. Special attention is given to a review and analysis of the research carried out, stressing its constructiveness and applied orientation. The efficiency of the developed algorithms and software is demonstrated on problems of modeling and optimizing output beam characteristics in linear resonance accelerators.
Abstract:
For metal and metal halide vapor lasers excited by a high-frequency pulsed discharge, the thermal effect, mainly caused by the radial temperature distribution, is of considerable importance for stable laser operation and for improving the laser output characteristics. A short survey of the analytical and numerical-analytical mathematical models obtained for the temperature profile in a high-powered He-SrBr2 laser is presented. The models are described by the steady-state heat conduction equation with mixed-type nonlinear boundary conditions for an arbitrary form of the volume power density. A complete model of the radial heat flow between the two tubes is established for precisely calculating the inner wall temperature. The models are applied to simulate temperature profiles for a newly designed laser. The author's software prototype LasSim is used to implement the mathematical models and carry out the simulations.
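As an illustration of the kind of equation involved, the sketch below solves a simplified steady-state radial heat conduction problem, (1/r) d/dr(r k dT/dr) = −q(r), with a symmetry condition on the axis and a fixed wall temperature, using scipy's boundary-value solver. The parameter values are placeholders, not the He-SrBr2 laser data, and this is not the LasSim models themselves.

```python
import numpy as np
from scipy.integrate import solve_bvp

R = 0.01          # tube radius, m (placeholder)
k = 0.15          # thermal conductivity of the gas, W/(m K) (placeholder)
T_wall = 800.0    # inner wall temperature, K (placeholder)
q0 = 2.0e6        # volume power density, W/m^3, taken constant here

def rhs(r, y):
    # y[0] = T, y[1] = dT/dr;  (1/r)(r k T')' = -q  =>  T'' = -q/k - T'/r
    T, dT = y
    return np.vstack([dT, -q0 / k - dT / r])

def bc(ya, yb):
    return np.array([ya[1], yb[0] - T_wall])   # dT/dr = 0 at the axis, T = T_wall at r = R

r = np.linspace(1e-6, R, 200)                  # start slightly off-axis to avoid 1/r
sol = solve_bvp(rhs, bc, r,
                np.vstack([np.full_like(r, T_wall), np.zeros_like(r)]))

print(f"computed axis temperature: {sol.y[0][0]:.1f} K")
print(f"analytic (constant q):     {T_wall + q0 * R**2 / (4 * k):.1f} K")
```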
Abstract:
Report published in the Proceedings of the National Conference on "Education in the Information Society", Plovdiv, May, 2012
Abstract:
The paper presents in brief the Bulgarian Digital Mathematical Library BulDML and the Czech Digital Mathematical Library DML-CZ. Both libraries use the open-source software DSpace and both are partners in the European Digital Mathematics Library EuDML. We describe their content and metadata schemas, outline the system architecture and give an overview of usage statistics.
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with such a dilemma for many years, resorting either to hardware approaches that are rather costly, with inherent calibration and noise effects, or to software techniques based on filtering the binning effect, but without successfully preserving the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data knowing that its statistical meaning has been faithfully preserved for its optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of such an inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
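The patented derivations are not reproduced in the abstract; the sketch below only illustrates the general idea of non-integer, multi-channel accumulation, spreading each log-transformed event fractionally between the two adjacent output channels instead of incrementing a single integer bin. Channel count and dynamic range are placeholder values.

```python
import numpy as np

N_CHANNELS = 256
DECADES = 4.0          # the log display spans 10^0 .. 10^DECADES (placeholder values)

def log_channel(value):
    """Map a linear measurement to a fractional channel position on the log axis."""
    pos = (N_CHANNELS / DECADES) * np.log10(np.maximum(value, 1.0))
    return np.clip(pos, 0.0, N_CHANNELS)

def accumulate(values):
    """Accumulate events, splitting each one between the two channels adjacent
    to its fractional (non-integer) log position."""
    hist = np.zeros(N_CHANNELS + 1)
    pos = log_channel(np.asarray(values, dtype=float))
    lo = np.floor(pos).astype(int)
    frac = pos - lo
    np.add.at(hist, lo, 1.0 - frac)                          # weight for the lower channel
    np.add.at(hist, np.minimum(lo + 1, N_CHANNELS), frac)    # weight for the upper channel
    return hist

events = np.random.lognormal(mean=5.0, sigma=1.0, size=10_000)
hist = accumulate(events)
print(f"total accumulated weight: {hist.sum():.1f} (equals the number of events)")
```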
Abstract:
The objective of this study was to develop a model to predict the transport and fate of gasoline components of environmental concern in the Miami River by mathematically simulating the movement of dissolved benzene, toluene, xylene (BTX), and methyl tertiary-butyl ether (MTBE) resulting from minor gasoline spills in the inter-tidal zone of the river. Computer codes were based on mathematical algorithms that acknowledge the role of advective and dispersive physical phenomena along the river and the prevailing phase transformations of BTX and MTBE. Phase transformations included volatilization and settling. The model used a finite-difference scheme under steady-state conditions, with a set of numerical equations solved by two numerical methods: Gauss-Seidel and Jacobi iterations. A numerical validation process was conducted by comparing the results from both methods with analytical and numerical reference solutions. Since similar trends were achieved after the numerical validation process, it was concluded that the computer codes were algorithmically correct. The Gauss-Seidel iteration yielded a faster convergence rate than the Jacobi iteration; hence, this code was selected to further develop the computer program and software. The model was then analyzed for its sensitivity. It was found that the model was very sensitive to wind speed but not to sediment settling velocity. A computer software package was developed with the model code embedded. The software provides two main user-friendly visual forms, one to interface with the database files and the other to execute and present the graphical and tabulated results. For all predicted concentrations of BTX and MTBE, the maximum concentrations were more than an order of magnitude lower than current drinking water standards. It should be pointed out, however, that concentrations below these standards, although not harmful to humans, may still be very harmful to organisms at the trophic levels of the Miami River ecosystem and associated waters. This computer model can be used for the rapid assessment and management of the effects of minor gasoline spills on inter-tidal riverine water quality.
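The abstract compares Jacobi and Gauss-Seidel iterations for the steady-state finite-difference system. The sketch below applies both schemes to a simplified 1-D advection-dispersion-decay equation with placeholder coefficients (not the Miami River model) to show why Gauss-Seidel typically converges in fewer iterations.

```python
import numpy as np

# D C'' - u C' - k C = 0,  C(0) = C0, C(L) = 0, central differences (placeholder values).
L, n = 100.0, 51
D, u, k, C0 = 1.0, 0.5, 0.01, 1.0
dx = L / (n - 1)
a_w = D / dx**2 + u / (2 * dx)      # west (upstream) coefficient
a_e = D / dx**2 - u / (2 * dx)      # east (downstream) coefficient
a_p = 2 * D / dx**2 + k             # diagonal coefficient

def solve(gauss_seidel, tol=1e-10, max_iter=100_000):
    C = np.zeros(n)
    C[0] = C0
    for it in range(1, max_iter + 1):
        C_old = C.copy()
        src = C if gauss_seidel else C_old   # Gauss-Seidel reads freshly updated values
        for i in range(1, n - 1):
            C[i] = (a_w * src[i - 1] + a_e * src[i + 1]) / a_p
        if np.max(np.abs(C - C_old)) < tol:
            return C, it
    return C, max_iter

_, it_jacobi = solve(gauss_seidel=False)
_, it_gs = solve(gauss_seidel=True)
print(f"Jacobi converged in {it_jacobi} iterations, Gauss-Seidel in {it_gs}")
```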
Abstract:
Airborne Light Detection and Ranging (LIDAR) technology has become the primary method to derive high-resolution Digital Terrain Models (DTMs), which are essential for studying Earth's surface processes, such as flooding and landslides. The critical step in generating a DTM is to separate ground and non-ground measurements in a voluminous LIDAR point dataset using a filter, because the DTM is created by interpolating ground points. As one of the widely used filtering methods, the progressive morphological (PM) filter has the advantages of classifying the LIDAR data at the point level, a linear computational complexity, and preserving the geometric shapes of terrain features. The filter works well in an urban setting with a gentle slope and a mixture of vegetation and buildings. However, the PM filter often removes ground measurements incorrectly in topographically high areas, along with large non-ground objects, because it uses a constant threshold slope, resulting in "cut-off" errors. A novel cluster analysis method was developed in this study and incorporated into the PM filter to prevent the removal of ground measurements at topographic highs. Furthermore, to obtain optimal filtering results for an area with undulating terrain, a trend analysis method was developed to adaptively estimate the slope-related thresholds of the PM filter based on changes of topographic slope and the characteristics of non-terrain objects. The comparison of the PM and generalized adaptive PM (GAPM) filters for selected study areas indicates that the GAPM filter preserves most of the "cut-off" points removed incorrectly by the PM filter. The application of the GAPM filter to seven ISPRS benchmark datasets shows that the GAPM filter reduces the filtering error by 20% on average, compared with the method used by the popular commercial software TerraScan. The combination of the cluster method, adaptive trend analysis, and the PM filter allows users without much experience in processing LIDAR data to effectively and efficiently identify ground measurements for complex terrain in a large LIDAR dataset. The GAPM filter is highly automatic and requires little human input. Therefore, it can significantly reduce the effort of manually processing voluminous LIDAR measurements.
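The core idea of the progressive morphological filter (on which the GAPM filter builds) can be sketched in one dimension: apply morphological openings with progressively larger windows and flag points whose elevation drops by more than a slope-dependent threshold as non-ground. The code below is a simplified illustration with made-up parameters and synthetic data, not the GAPM implementation.

```python
import numpy as np
from scipy.ndimage import grey_opening

def pm_filter_1d(z, cell_size=1.0, max_window=33, slope=0.3, dh0=0.2, dh_max=3.0):
    """1-D progressive morphological filtering: returns a boolean ground mask."""
    ground = np.ones(z.size, dtype=bool)
    surface = z.copy()
    window = 3
    while window <= max_window:
        opened = grey_opening(surface, size=window)
        # Threshold grows with window size (constant-slope assumption of the PM filter).
        dh = min(dh0 + slope * (window - 1) * cell_size, dh_max)
        ground &= (surface - opened) <= dh
        surface = opened
        window = 2 * window - 1          # progressively increase the window
    return ground

# Synthetic profile: gentle terrain plus a 10 m "building" between x = 40 and 60.
x = np.arange(200, dtype=float)
z = 0.05 * x + 0.5 * np.sin(x / 15.0)
z[40:60] += 10.0
mask = pm_filter_1d(z)
print(f"points kept as ground: {mask.sum()} of {z.size}")
```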