899 results for Interval arithmetic



Relevance: 60.00%

Abstract:

This thesis provides efficient and robust algorithms, based on algebraic and numerical methods, for computing the intersection curve between a torus and a simple surface (e.g. a plane, a natural quadric, or another torus). The algebraic part classifies the topological type of the intersection curve and detects degenerate situations such as embedded conic sections and singularities; it also determines reference points on each connected component of the intersection curve. The required computations are carried out efficiently, solving polynomials of degree at most four, and exactly, using exact arithmetic. The numerical part traces each component of the intersection curve, starting from the previously computed reference points. Interval arithmetic prevents accidental errors such as jumping between branches or skipping parts of the curve, and neighbourhoods of singularities are treated correctly. The algorithms are complete in the sense that any kind of input, including degenerate and singular configurations, can be handled. They are verified, since the results are topologically correct and approximate the true intersection curve to any given error bound. They are robust, since no human intervention is required, and efficient, since the treatment of algebraic equations of high degree is avoided.
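
The guarantee described here rests on a simple primitive: evaluating a surface's defining function over a whole box at once. Below is a minimal Python sketch of that primitive; it is our illustration, not the thesis's tracer, and the `Interval` class and the unit-circle example are ours. If the interval evaluation of f over a box excludes zero, the curve provably does not enter the box, which is the kind of certificate that rules out jumping between branches or skipping curve segments.

```python
# Minimal sketch (ours, not the thesis's algorithm): interval evaluation of a
# defining function f over a box. If the enclosure of f excludes zero, the
# intersection curve provably does not pass through the box, so a tracer
# stepping through certified boxes cannot silently jump to another branch.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        other = _iv(other)
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        other = _iv(other)
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def contains_zero(self):
        return self.lo <= 0.0 <= self.hi

def _iv(x):
    return x if isinstance(x, Interval) else Interval(x, x)

# Toy defining function: f(x, y) = x^2 + y^2 - 1 vanishes on the unit circle.
def f(x, y):
    return x * x + y * y + (-1.0)

# A box far from the circle: the enclosure excludes zero, so the box is
# certified curve-free (x*x is evaluated naively, so enclosures are not tight).
print(f(Interval(0.1, 0.2), Interval(0.1, 0.2)).contains_zero())   # False
# A box straddling the circle: zero is not excluded, so a tracer must
# subdivide or take a smaller step here.
print(f(Interval(0.9, 1.1), Interval(-0.1, 0.1)).contains_zero())  # True
```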

Relevance: 60.00%

Abstract:

In this paper, we confirm, with absolute certainty, a conjecture on a certain oscillatory behaviour of higher auto-ionizing resonances of atoms and molecules beyond a threshold. These results not only definitively settle a controversy of more than 30 years between Rittby et al. (1981 Phys. Rev. A 24, 1636–1639 (doi:10.1103/PhysRevA.24.1636)) and Korsch et al. (1982 Phys. Rev. A 26, 1802–1803 (doi:10.1103/PhysRevA.26.1802)), but also provide new and reliable information on the threshold. Our interval-arithmetic-based method allows one, for the first time, to enclose and to exclude resonances with guaranteed certainty. The efficiency of our approach is demonstrated by the fact that we are able to show that the approximations in Rittby et al. do lie near true resonances, whereas the approximations of higher resonances in Korsch et al. do not, and further that there exist two new pairs of resonances, as suggested in Abramov et al. (2001 J. Phys. A 34, 57–72 (doi:10.1088/0305-4470/34/1/304)).
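
The enclose/exclude dichotomy mentioned above can be illustrated, in a heavily simplified one-dimensional setting, by the interval Newton test. The sketch below is our illustration only: the paper's method targets resonances of quantum Hamiltonians and is far more involved, and the sketch ignores the directed rounding a rigorous implementation would require.

```python
def inewton_step(f, df_range, lo, hi):
    """One interval Newton step N(X) = m - f(m) / [f'](X) on X = [lo, hi].

    Returns 'excludes'  if X provably contains no zero of f,
            'contains'  if N(X) lies strictly inside X, which proves that X
                        contains exactly one zero of f,
            'unknown'   if the test is inconclusive (then bisect X and retry).
    """
    dlo, dhi = df_range(lo, hi)        # enclosure of f' over X
    if dlo <= 0.0 <= dhi:              # derivative may vanish: no conclusion
        return 'unknown'
    m = 0.5 * (lo + hi)
    fm = f(m)
    q_lo, q_hi = sorted((fm / dlo, fm / dhi))
    n_lo, n_hi = m - q_hi, m - q_lo    # N(X) as an interval
    if n_hi < lo or n_lo > hi:
        return 'excludes'
    if lo < n_lo and n_hi < hi:
        return 'contains'
    return 'unknown'

# Example: f(x) = x^2 - 2 with f'(x) = 2x; over [a, b] with a > 0 the exact
# range of f' is simply [2a, 2b].
f = lambda x: x * x - 2.0
df_range = lambda lo, hi: (2.0 * lo, 2.0 * hi)

print(inewton_step(f, df_range, 1.0, 2.0))   # 'contains': encloses sqrt(2)
print(inewton_step(f, df_range, 2.0, 3.0))   # 'excludes': no zero in [2, 3]
```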

Relevance: 60.00%

Abstract:

In this thesis we present an approach to the automated verification of floating-point programs. Existing techniques for the automated generation of correctness theorems are extended to produce proof obligations for accuracy guarantees and the absence of floating-point exceptions. A prototype automated real-number theorem prover is presented, demonstrating a novel application of function interval arithmetic in the context of subdivision-based numerical theorem proving. The prototype is tested on correctness theorems for two simple yet nontrivial programs, proving exception freedom and tight accuracy guarantees automatically. The experiments show how function intervals can be used to combat the information-loss problems that limit the applicability of traditional interval arithmetic in hard real-number theorem proving.
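
As a rough picture of subdivision-based numerical theorem proving, the toy prover below (ours; the prototype described above uses function intervals, which carry more information than the plain interval enclosures used here) attempts to prove a universally quantified inequality by interval evaluation and bisection.

```python
# Toy subdivision prover (ours, not the thesis's prototype): to prove
# "for all x in [a, b]: f(x) > 0", evaluate an interval extension of f over
# boxes; a box whose enclosure has a positive lower bound is proved, one
# whose upper bound is negative refutes the claim, and anything else is
# bisected. Information loss in the enclosure only costs extra splitting.

def prove_positive(f_range, a, b, max_depth=40):
    stack = [(a, b, 0)]
    while stack:
        lo, hi, depth = stack.pop()
        f_lo, f_hi = f_range(lo, hi)          # enclosure of f over [lo, hi]
        if f_lo > 0.0:
            continue                          # proved on this box
        if f_hi < 0.0 or depth >= max_depth:
            return False                      # refuted, or gave up
        mid = 0.5 * (lo + hi)
        stack += [(lo, mid, depth + 1), (mid, hi, depth + 1)]
    return True

# f(x) = x^2 - x + 0.3 is positive on all of R (its discriminant is -0.2).
# A valid natural interval extension over [lo, hi]:
def f_range(lo, hi):
    sq_lo = 0.0 if lo <= 0.0 <= hi else min(lo * lo, hi * hi)
    sq_hi = max(lo * lo, hi * hi)
    return (sq_lo - hi + 0.3, sq_hi - lo + 0.3)

print(prove_positive(f_range, -2.0, 2.0))     # True: proved by subdivision
```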

Relevance: 60.00%

Abstract:

The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for the verification of numerical software, supporting a substantially more expressive specification language than other publicly available automated tools. The additional expressivity is provided by two constructs. First, specifications can feature inclusions between interval arithmetic expressions. Second, the integral operator of classical analysis can be used in specifications, with integration bounds given by arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned above. A key component of our method is an algorithm for proving numerical theorems, based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm; its source code is publicly available. In this paper we report on experiments using PolyPaver that indicate that the additional expressivity does not come at a performance cost compared with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger, randomly generated programs. © 2014 Springer International Publishing Switzerland.
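
One of the specification constructs mentioned, the integral operator, can be given a verified meaning through interval arithmetic. The sketch below is our illustration rather than PolyPaver's algorithm (which uses polynomial approximation); the integrand, bounds, and names are ours, and float rounding is ignored for clarity, whereas a rigorous tool would round the two sums outwards.

```python
# Our illustration (not PolyPaver's algorithm): a verified enclosure of an
# integral via interval arithmetic, the kind of fact a specification using
# an integral operator asks a prover to establish.

def integral_enclosure(f_range, a, b, n):
    """Enclose the integral of f over [a, b] by summing per-cell range * width."""
    lo_sum, hi_sum = 0.0, 0.0
    h = (b - a) / n
    for i in range(n):
        x0, x1 = a + i * h, a + (i + 1) * h
        f_lo, f_hi = f_range(x0, x1)    # enclosure of f over the cell
        lo_sum += f_lo * h
        hi_sum += f_hi * h
    return lo_sum, hi_sum

# f(x) = x^2 is increasing on [0, 1], so its exact range on a cell is easy.
f_range = lambda x0, x1: (x0 * x0, x1 * x1)

lo, hi = integral_enclosure(f_range, 0.0, 1.0, 1000)
print(lo, hi)                 # encloses the true value 1/3, width about 1e-3
assert lo <= 1.0 / 3.0 <= hi  # the exact integral provably lies in [lo, hi]
```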

Relevance: 60.00%

Abstract:

This work reports on new software for solving linear systems involving affine-linear dependencies between complex-valued interval parameters. We discuss the implementation of a parametric residual iteration for linear interval systems through advanced communication between the Mathematica system and the C-XSC library, which supports rigorous complex interval arithmetic. An example of an AC electrical circuit illustrates the use of the presented software.
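
For readers unfamiliar with residual iterations on interval systems, the following toy shows their shape on a real-valued example. It is our sketch, not the reported software: that software handles complex interval parameters rigorously via C-XSC with directed rounding, while this float version only illustrates the structure of the computation. The system, the parameter interval, and all names are ours.

```python
# Toy parametric residual step (ours) for A(p) x = b with affine-linear
# dependence on one parameter p in [3.9, 4.1], A(p) = [[p, 1], [1, 2]],
# b = [1, 0]. Intervals are (lo, hi) pairs; rounding errors are ignored.

def iadd(x, y): return (x[0] + y[0], x[1] + y[1])
def isub(x, y): return (x[0] - y[1], x[1] - y[0])
def imul(x, y):
    p = (x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1])
    return (min(p), max(p))
def pt(v): return (v, v)                       # point interval [v, v]
def inflate(x, f=2.0, eps=1e-12):              # epsilon-inflation of a box
    m, r = 0.5 * (x[0] + x[1]), 0.5 * (x[1] - x[0])
    return (m - f * r - eps, m + f * r + eps)

P  = (3.9, 4.1)                                # parameter interval
Ap = [[P, pt(1.0)], [pt(1.0), pt(2.0)]]        # interval matrix A(p)
b  = [pt(1.0), pt(0.0)]
R  = [[2/7, -1/7], [-1/7, 4/7]]                # approx inverse of A(4.0)
xt = [2/7, -1/7]                               # approx midpoint solution

def mv(M, v):                                  # interval matrix * vector
    return [iadd(imul(M[i][0], v[0]), imul(M[i][1], v[1])) for i in range(2)]

Rp = [[pt(R[i][j]) for j in range(2)] for i in range(2)]
# Z = R (b - A(p) xt): the parametric residual, mapped back through R.
Z = mv(Rp, [isub(b[i], mv(Ap, [pt(xt[0]), pt(xt[1])])[i]) for i in range(2)])
# C = I - R A(p): an interval matrix, small when R is a good inverse.
RA = [[iadd(imul(Rp[i][0], Ap[0][j]), imul(Rp[i][1], Ap[1][j]))
       for j in range(2)] for i in range(2)]
C = [[isub(pt(1.0 if i == j else 0.0), RA[i][j]) for j in range(2)]
     for i in range(2)]

# One residual step on an inflated candidate box E: if Z + C E lands strictly
# inside E, the error of xt is enclosed by it (Brouwer fixed-point argument).
E  = [inflate(z) for z in Z]
En = [iadd(Z[i], mv(C, E)[i]) for i in range(2)]
assert all(E[i][0] < En[i][0] and En[i][1] < E[i][1] for i in range(2))
enclosure = [iadd(pt(xt[i]), En[i]) for i in range(2)]
print(enclosure)  # encloses solutions of A(p) x = b for every p in [3.9, 4.1]
```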

Relevance: 30.00%

Abstract:

Using fixed-point arithmetic is one of the most common design choices for systems where area, power, or throughput are heavily constrained. In order to produce implementations where cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. Finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem, to which developers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of these platforms with respect to ASICs. As FPGAs become commonly used for scientific computation, designs grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization-noise modelling and word-length optimization methodologies. In this Ph.D. thesis we explore different aspects of the quantization problem and present new methodologies for each of them. Techniques based on extensions of intervals have yielded accurate models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art methodology based on statistical Modified Affine Arithmetic (MAA) in order to model systems that contain control-flow structures. Our methodology produces the different execution paths automatically, determines the regions of the input domain that will exercise them, and extracts the statistical moments of the system from these partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by as little as 0.04% from simulation-based reference values. A known drawback of techniques based on extensions of intervals is the combinatorial explosion of terms as the size of the targeted system grows, which leads to scalability problems. To address this issue we present a clustered noise-injection technique that groups the signals of the system, introduces the noise sources for each group independently, and finally combines the per-group results. In this way the number of noise sources active at any given time is kept under control and the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results caused by the loss of correlation between noise terms, in order to keep the results as accurate as possible. This thesis also covers the development of word-length optimization methodologies based on Monte-Carlo simulations that run in reasonable times. We present two novel techniques that attack the reduction of execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization. Second, the incremental method builds on the fact that, although we must guarantee a given confidence level in the simulations for the final results of the optimization, we can use more relaxed levels, and hence considerably fewer samples per simulation, in the initial stages of the search, when we are still far from the optimized solution. With these two approaches we demonstrate that the execution time of classical greedy search algorithms can be reduced by factors of up to ×240 for small and medium-sized problems. Finally, this work introduces HOPLITE, an automated, flexible, and modular quantization framework that implements the above techniques and is publicly available. Its aim is to offer developers and researchers a common ground for easily prototyping and verifying new methodologies for system modelling and word-length optimization. We describe its workflow, justify the design decisions taken, explain its public API, and give a step-by-step demonstration of its operation. Through a simple example we also show how new extensions can be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
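
Affine arithmetic, the plain ancestor of the Modified Affine Arithmetic used in this thesis, is easy to sketch. The toy below is our illustration (the thesis combines statistical MAA with ME-gPC, which is substantially more elaborate); it shows the property that makes these "extensions of intervals" attractive for signal and quantization-noise modelling: correlations between signals are tracked, so ranges do not blow up the way they do in naive interval arithmetic.

```python
# Hedged sketch (ours) of plain affine arithmetic for range analysis.
# Class and variable names are ours, not the thesis's.
import itertools

_fresh = itertools.count()     # generator of fresh noise-symbol indices

class Affine:
    """x = c + sum_i coeffs[i] * eps_i, with each eps_i ranging over [-1, 1]."""

    def __init__(self, c, coeffs=None):
        self.c, self.coeffs = c, dict(coeffs or {})

    @classmethod
    def from_range(cls, lo, hi):
        return cls(0.5 * (lo + hi), {next(_fresh): 0.5 * (hi - lo)})

    def _zip(self, o, op):
        keys = set(self.coeffs) | set(o.coeffs)
        return {k: op(self.coeffs.get(k, 0.0), o.coeffs.get(k, 0.0))
                for k in keys}

    def __add__(self, o):
        return Affine(self.c + o.c, self._zip(o, lambda a, b: a + b))

    def __sub__(self, o):
        return Affine(self.c - o.c, self._zip(o, lambda a, b: a - b))

    def __mul__(self, o):
        # Affine part of the product; the nonlinear remainder is bounded by
        # rad(self) * rad(o) and folded into a fresh noise symbol.
        coeffs = self._zip(o, lambda a, b: self.c * b + o.c * a)
        coeffs[next(_fresh)] = self.rad() * o.rad()
        return Affine(self.c * o.c, coeffs)

    def rad(self):
        return sum(abs(v) for v in self.coeffs.values())

    def range(self):
        return (self.c - self.rad(), self.c + self.rad())

x = Affine.from_range(-1.0, 1.0)
print((x - x).range())       # (0.0, 0.0): correlation tracked exactly, where
                             # naive interval arithmetic would give [-2, 2]
print((x * x + x).range())   # (-2.0, 2.0): a valid (if loose) enclosure of
                             # the datapath x^2 + x, usable to size its
                             # fixed-point integer word-length
```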

Relevance: 20.00%

Abstract:

There exist uniquely ergodic affine interval exchange transformations of [0,1] with flips which have wandering intervals and are such that the support of the invariant measure is a Cantor set.

Relevance: 20.00%

Abstract:

In this study we used fluorescence spectroscopy to determine the post-mortem interval. Conventional methods in forensic medicine involve sampling tissue or body fluids and running laboratory tests, which are often time-demanding and may depend on expensive analyses. The presented method consists of using time-dependent variations in the fluorescence spectrum and their correlation with the time elapsed since the cessation of regular metabolic activity. This new approach addresses unmet needs in post-mortem interval determination in forensic medicine by providing rapid, in situ measurements with improved time resolution relative to existing methods. (C) 2009 Optical Society of America

Relevance: 20.00%

Abstract:

While the physiological adaptations that occur following endurance training in previously sedentary and recreationally active individuals are relatively well understood, the adaptations to training in already highly trained endurance athletes remain unclear. While significant improvements in endurance performance and corresponding physiological markers are evident following submaximal endurance training in sedentary and recreationally active groups, an additional increase in submaximal training (i.e. volume) in highly trained individuals does not appear to further enhance either endurance performance or associated physiological variables [e.g. peak oxygen uptake (V̇O2peak), oxidative enzyme activity]. It seems that, for athletes who are already trained, improvements in endurance performance can be achieved only through high-intensity interval training (HIT). The limited research which has examined changes in muscle enzyme activity in highly trained athletes following HIT has revealed no change in oxidative or glycolytic enzyme activity, despite significant improvements in endurance performance (p < 0.05). Instead, an increase in skeletal muscle buffering capacity may be one mechanism responsible for an improvement in endurance performance. Changes in plasma volume, stroke volume, as well as muscle cation pumps, myoglobin, capillary density and fibre type characteristics have yet to be investigated in response to HIT in the highly trained athlete. Information relating to HIT programme optimisation in endurance athletes is also very sparse. Preliminary work using the velocity at which V̇O2max is achieved (Vmax) as the interval intensity, and fractions (50 to 75%) of the time to exhaustion at Vmax (Tmax) as the interval duration, has been successful in eliciting improvements in performance in long-distance runners. However, Vmax and Tmax have not been used with cyclists. Instead, HIT programme optimisation research in cyclists has revealed that repeated supramaximal sprinting may be equally effective as more traditional HIT programmes for eliciting improvements in endurance performance. Further examination of the biochemical and physiological adaptations which accompany different HIT programmes, as well as investigation into the optimal HIT programme for eliciting performance enhancements in highly trained athletes, is required.

Relevance: 20.00%

Abstract:

Objective: To determine the incidence of interval cancers occurring in the first 12 months after mammographic screening at a mammographic screening service. Design: Retrospective analysis of data obtained by cross-matching the screening service's database with the New South Wales Central Cancer Registry. Setting: The Central & Eastern Sydney Service of BreastScreen NSW. Participants: Women aged 40-69 years at first screen who attended for their first or second screen between 1 March 1988 and 31 December 1992. Main outcome measures: Interval-cancer rates per 10 000 screens and as a proportion of the underlying incidence of breast cancer (estimated from the underlying rate in the total NSW population). Results: The 12-month interval-cancer incidence per 10 000 screens was 4.17 for the 40-49 years age group (95% confidence interval [CI], 1.35-9.73) and 4.64 for the 50-69 years age group (95% CI, 2.47-7.94). Proportional incidence rates were 30.1% for the 40-49 years age group (95% CI, 9.8-70.3) and 22% for the 50-69 years age group (95% CI, 11.7-37.7). There was no significant difference between the proportional incidence rate for the 50-69 years age group at the Central & Eastern Sydney Service and those of the major successful overseas screening trials. Conclusion: Screening quality was acceptable and should result in a significant mortality reduction in the screened population. Given the small number of cancers involved, comparison of the interval-cancer statistics of mammographic screening programs with trials requires age-specific or age-adjusted data, and consideration of the confidence intervals of both program and trial data.

Relevance: 20.00%

Abstract:

Objective: To evaluate the impact of increasing the minimum resupply period for prescriptions under the Pharmaceutical Benefits Scheme (PBS) in November 1994. The intervention was designed to reduce the stockpiling, under the PBS safety net, of medicines used for chronic medical conditions. Methods: Interrupted time-series regression analyses were performed on 114 months of PBS drug utilisation data from January 1991 to June 2000. These analyses assessed whether there had been a significant interaction between the onset of the intervention in November 1994 and the extreme levels of drug utilisation in the months of December (peak utilisation) and January (lowest utilisation), respectively. Both serial and 12-month-lag autocorrelations were controlled for. Results: The onset of the intervention was associated with a significant reduction in the December peak in drug utilisation; after the introduction of the policy there were, on average, 1,150,196 fewer prescriptions for that month (95% CI 708,333-1,592,059). There was, however, no significant change in the low level of utilisation in January. The effect of the policy appears to be decreasing across successive post-intervention years, though the odds of a prescription being dispensed in December remained significantly lower in 1999 than in each of the pre-intervention years (11% vs. 14%). Conclusion: Analysis of the impact of increasing the resupply period for PBS prescriptions showed that the magnitude of peak utilisation in December was markedly reduced by the policy, though this effect appears to be diminishing over time. Continued monitoring and policy review are warranted to ensure that the initial effect of the intervention is maintained.

Relevance: 20.00%

Abstract:

This study examined the effects of four high-intensity interval-training (HIT) sessions performed over 2 weeks on peak oxygen uptake (V̇O2peak), the first and second ventilatory thresholds (VT1 and VT2), and peak power output (PPO) in highly trained cyclists. Fourteen highly trained male cyclists (V̇O2peak = 67.5 ± 3.7 ml·kg⁻¹·min⁻¹) performed a ramped cycle test to determine V̇O2peak, VT1, VT2, and PPO. Subjects were divided equally into a HIT group and a control group. The HIT group performed four HIT sessions (20 × 60 s at PPO, 120 s recovery); the V̇O2peak test was repeated less than 1 week after the HIT program. Control subjects maintained their regular training program and were reassessed under the same timeline. There was no change in V̇O2peak for either group; however, the HIT group showed a significantly greater increase in VT1 (+22% vs. -3%), VT2 (+15% vs. -1%), and PPO (+4.3% vs. -0.4%) compared to controls (all p < .05). This study has demonstrated that HIT can improve VT1, VT2, and PPO after only four HIT sessions in already highly trained cyclists.