954 results for Calculus of operations.


Relevance: 80.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 80.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 80.00%

Abstract:

Pós-graduação em Matemática em Rede Nacional - IBILCE

Relevance: 80.00%

Abstract:

This work presents a novel fractal-dimension method for shape analysis. The proposed technique extracts descriptors from a shape by applying a multi-scale approach to the calculation of the fractal dimension. The fractal dimension is estimated by applying the curvature scale-space technique to the original shape. By applying a multi-scale transform to this calculation, we obtain a set of descriptors capable of describing the shape under investigation with high precision. We validate the computed descriptors in a classification process. The results demonstrate that the novel technique provides highly reliable descriptors, confirming the efficiency of the proposed method. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4757226]
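
The abstract does not spell out the estimator, and the paper works on the curvature scale-space of the contour; the Python sketch below only illustrates the generic multi-scale idea using box counting, with all names hypothetical.

```python
# Minimal sketch of multi-scale fractal-dimension estimation by box
# counting; the paper itself derives the dimension from the curvature
# scale-space of the contour, so this shows only the general idea.
import numpy as np

def box_counting_dimension(points: np.ndarray, scales) -> float:
    """Estimate the fractal dimension of a 2-D point set (e.g. a shape
    contour) as the slope of log N(s) against log(1/s)."""
    counts = []
    for s in scales:
        # Snap points to a grid of box side s and count occupied boxes.
        occupied = np.unique(np.floor(points / s).astype(int), axis=0)
        counts.append(len(occupied))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(scales)), np.log(counts), 1)
    return slope

# Example: a straight segment should yield a dimension close to 1.
t = np.linspace(0.0, 1.0, 5000)
segment = np.column_stack([t, t])
print(box_counting_dimension(segment, scales=[0.1, 0.05, 0.025, 0.0125]))
```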

Relevance: 80.00%

Abstract:

Matita (Italian for "pencil") is a new interactive theorem prover under development at the University of Bologna. Compared with state-of-the-art proof assistants, Matita presents both traditional and innovative aspects. The underlying calculus of the system, the Calculus of (Co)Inductive Constructions (CIC for short), is well known and is also the basis of another mainstream proof assistant, Coq, with which Matita is to some extent compatible. In the same spirit as several other systems, proof authoring is conducted by the user as a goal-directed proof search, using a script to store textual commands for the system. In the LCF tradition, the proof language of Matita is procedural and relies on tactics and tacticals to proceed toward proof completion. The interaction paradigm offered to the user is based on the script-management technique behind the popularity of the Proof General generic interface for interactive theorem provers: while editing a script, the user can move the execution point forward to deliver commands to the system, or backward to retract (or "undo") past commands.

Matita has been developed from scratch over the past 8 years by several members of the Helm research group, of which this thesis's author is one. Matita is now a full-fledged proof assistant with a library of about 1,000 concepts. Several innovative solutions spun off from this development effort. This thesis is about the design and implementation of some of those solutions, in particular those relevant to user interaction with theorem provers, to which this thesis's author was a major contributor. Joint work with other members of the research group is pointed out where needed. The main topics discussed in this thesis are briefly summarized below.

Disambiguation. Most activities connected with interactive proving require the user to input mathematical formulae. Since mathematical notation is ambiguous, parsing formulae typeset as mathematicians like to write them on paper is a challenging task, one neglected by several theorem provers, which usually prefer to fix an unambiguous input syntax. Exploiting features of the underlying calculus, Matita offers an efficient disambiguation engine that permits typing formulae in familiar mathematical notation.

Step-by-step tacticals. Tacticals are higher-order constructs used in proof scripts to combine tactics. With tacticals, scripts can be made shorter, more readable, and more resilient to change. Unfortunately, they are de facto incompatible with state-of-the-art user interfaces based on script management: such interfaces do not permit positioning the execution point inside complex tacticals, introducing a trade-off between the usefulness of structured scripts and a tedious big-step execution behavior during script replaying. In Matita we break this trade-off with tinycals: an alternative to a subset of LCF tacticals that can be evaluated in a more fine-grained manner.

Extensible yet meaningful notation. Proof-assistant users often need to create new mathematical notation to ease the use of new concepts. The framework used in Matita for extensible notation both accounts for high-quality bidimensional rendering of formulae (with the expressivity of MathML Presentation) and provides meaningful notation, where presentational fragments are kept synchronized with the semantic representation of terms. Using our approach, interoperability with other systems can be achieved at the content level, and direct manipulation of formulae acting on their rendered forms is also possible.

Publish/subscribe hints. Automation plays an important role in interactive proving, as users like to delegate tedious proving sub-tasks to decision procedures or external reasoners. Exploiting the Web-friendliness of Matita, we experimented with a broker and a network of web services (called tutors) that can independently try to complete open sub-goals of a proof currently being authored in Matita. The user receives hints from the tutors on how to complete sub-goals and can apply them to the current proof interactively or automatically.

Another innovative aspect of Matita, only marginally touched by this thesis, is the embedded content-based search engine Whelp, which is exploited to various ends, from automatic theorem proving to avoiding duplicate work for the user. We also discuss the (potential) reusability in other systems of the widgets presented in this thesis, and how we envisage the evolution of user interfaces for interactive theorem provers in the Web 2.0 era.
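
As a rough illustration of what LCF tacticals do (a toy model, not Matita's actual API, whose goals and tactics are far richer), one can view a tactic as a function from a goal to the list of subgoals it leaves open, and tacticals as combinators over such functions:

```python
# Toy model of LCF-style tacticals (illustrative only, not Matita's API):
# a tactic maps a goal to the subgoals it leaves open; a failing tactic
# raises; tacticals compose tactics into bigger ones.
from typing import Callable, List

Goal = str
Tactic = Callable[[Goal], List[Goal]]

def then_(t1: Tactic, t2: Tactic) -> Tactic:
    """THEN: apply t1, then apply t2 to every subgoal t1 produced."""
    return lambda goal: [g2 for g1 in t1(goal) for g2 in t2(g1)]

def orelse(t1: Tactic, t2: Tactic) -> Tactic:
    """ORELSE: try t1 and fall back to t2 if t1 fails."""
    def tac(goal: Goal) -> List[Goal]:
        try:
            return t1(goal)
        except ValueError:
            return t2(goal)
    return tac

def split(goal: Goal) -> List[Goal]:
    """Split a conjunction 'A & B' into the two subgoals A and B."""
    if " & " not in goal:
        raise ValueError("split: goal is not a conjunction")
    left, right = goal.split(" & ", 1)
    return [left, right]

identity: Tactic = lambda goal: [goal]

# 'split THEN (split ORELSE id)' on a nested conjunction:
print(then_(split, orelse(split, identity))("A & B & C"))  # ['A', 'B', 'C']
```

The point of tinycals, in this picture, is that execution can stop and resume inside such combinators, instead of treating the whole composed tactic as one indivisible step.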

Relevance: 80.00%

Abstract:

Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language humans use to communicate and reason about mathematics and the lower-level language a machine is able to "understand" and process. The user perceives this gap as missing features or inefficiencies; the developer tries to accommodate user requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers.

We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the computer science department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving access through the web to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are:
• to study the architecture of these tools, with the aim of understanding the source of their complexity;
• to exploit that knowledge to experiment with new solutions that, for backward-compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq.

Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with those produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different.

The thesis consists of two parts, describing our experience as a user and as a developer of interactive provers, respectively. The first part is based on two formalisation experiences:
• our internship with the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the Four Colour Theorem;
• our collaboration on the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem.

The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping, like CIC.

In the second part of the manuscript, many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita: integrated search facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the search facilities, that generate proof commands (and not only proof objects, which are usually kept hidden from the user), one of which is specifically designed to be user-driven.
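
As a brief aside on the Curry-Howard view mentioned above: under proofs-as-programs, a proof object is just a typed lambda term. The snippet below is written in Lean (not in Matita's or Coq's CIC syntax, so it is only an illustration of the idea): proving A → B → A amounts to exhibiting such a term.

```lean
-- Proofs as programs: the proof object for A → B → A is the K combinator.
example (A B : Prop) : A → B → A :=
  fun a _ => a
```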

Relevance: 80.00%

Abstract:

The research frames, within the author's oeuvre, the specific theme of the residence. It constitutes the field of application of the architectural project in which the characteristic traits of the architect's design method can be most effectively sought, and it is the interpretive key of the proposed study. The process leading to the material constitution of architecture is considered in the phases into which it is decomposed, in the tools it adopts, in the goals it sets, in its relationship with production systems, and in how it addresses the themes of form and programme, and it is compared with the extensive literature in the thought of several authors close to Ignazio Gardella. In this way the traits of a methodology strongly marked by realism are defined, one that renders coherent an empirical and rational inquiry tied to an idea of classical architecture, of Enlightenment origin and attentive to the demands of modernity, within which is realised the linguistic heteronomy that characterises one of the distinctive traits of Ignazio Gardella's architecture; an aspect repeatedly interpreted as an affiliation with the twentieth-century movements that constantly intersect the architect's long career. The analysis of the residential work is conducted not through exemplary cases but over the totality of the projects, drawing also on unpublished contributions. It is understood as a path of personal research into compositional processes and the use of language, and it allows a repositioning of the figure of Gardella in relation to the making of architecture and its realisation, rather than to any will to comply with a-priori styles or norms. It is the practical dimension, that of the craft, that lends itself best to the interpretation of Gardella's projects. The architect's residences show a capacity to adapt to the constraints of site, client, and technology through formal re-interpretation and the transfer, from one theme to another, of the essential elements that convey through their image a precise idea of house and of architecture, not authorial, but recognisable and timeless.

Relevance: 80.00%

Abstract:

This thesis presents a universal model of documents and deltas. The model formalizes what it means to find differences between documents and provides a single shared formalization that any algorithm can use to describe the differences found between any kind of comparable documents. The main scientific contribution of this thesis is a universal delta model that can be used to represent the changes found by an algorithm. Its main parts are the formal definitions of changes (the pieces of information recording that something has changed), operations (the definitions of the kinds of change that happened), and deltas (coherent summaries of what has changed between two documents). The fundamental mechanism that makes the universal delta model a very expressive tool is the use of encapsulation relations between changes: in the universal delta model, changes are not always simple records of what has changed; they can also be combined into more complex changes reflecting the detection of more meaningful modifications. In addition to the main entities (i.e., changes, operations, and deltas), the model also describes and defines documents and the concept of equivalence between documents. As corollaries to the model, there are an extensible catalog of the operations that algorithms can detect, used to create a common library of operations, and a UML serialization of the model, useful as a reference when implementing APIs that deal with deltas. The universal delta model presented in this thesis acts as the formal groundwork upon which algorithms can be based and libraries can be implemented. It removes the need to recreate a new delta model and terminology whenever a new algorithm is devised, and it alleviates the problems that toolmakers face when adapting their software to new diff algorithms.
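
A minimal sketch of the three core entities follows; the field names are hypothetical illustrations, not the thesis's formal definitions or its UML serialization.

```python
# Hedged sketch of the universal delta model's core entities; field
# names are illustrative, not the thesis's formal definitions.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Change:
    """A record that something changed; may encapsulate finer changes."""
    operation: str                      # kind of change (from the catalog)
    target: str                         # where in the document it happened
    old: Optional[str] = None           # previous content, if any
    new: Optional[str] = None           # new content, if any
    parts: List["Change"] = field(default_factory=list)  # encapsulated changes

@dataclass
class Delta:
    """A coherent summary of what changed between two documents."""
    source_doc: str
    target_doc: str
    changes: List[Change] = field(default_factory=list)

# A word replacement detected as the encapsulation of a delete and an insert:
replace = Change("wording", "para[3]/word[7]", parts=[
    Change("delete", "para[3]/word[7]", old="colour"),
    Change("insert", "para[3]/word[7]", new="color"),
])
delta = Delta("draft-v1", "draft-v2", [replace])
```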

Relevance: 80.00%

Abstract:

A permutation is said to avoid a pattern if it contains no subsequence that is order-isomorphic to it. Donald Knuth, in the first volume of his celebrated book "The Art of Computer Programming", observed that the permutations that can be computed (or, equivalently, sorted) by certain data structures can be characterized in terms of pattern avoidance. In more recent years the topic has been reopened several times, often in terms of sortable rather than computable permutations. The idea of sorting permutations with one of Knuth's devices suggests looking for a deterministic procedure that decides, in linear time, whether there exists a sequence of operations able to convert a given permutation into the identity. In this thesis we show that, for the stack and the restricted deques, there exists a unique way to implement such a procedure. Moreover, we use these sorting procedures to create new sorting algorithms, and we prove some unexpected commutation properties between these procedures and the base step of bubblesort. We also show that the permutations sortable by a combination of the base steps of bubblesort and its dual can be expressed, once again, in terms of pattern avoidance. In the final chapter we give an alternative proof of some enumerative results, in particular for the classes of permutations that can be sorted by the two restricted deques. It is well known that the permutations sortable through a restricted deque are counted by the Schröder numbers. In the thesis, we show how the deterministic sorting procedures yield a bijection between sortable permutations and Schröder paths.
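
For the stack, the kind of deterministic procedure alluded to above can be sketched as the classical greedy test (a standard construction, not necessarily the thesis's exact formulation): push each element, popping any smaller stack elements first, and check whether the output comes out sorted. This succeeds exactly on the 231-avoiding permutations and runs in linear time.

```python
# Greedy linear-time test for stack sortability (a standard construction;
# the thesis proves the deterministic procedure is unique for the stack
# and the restricted deques).
def stack_sortable(perm):
    stack, output = [], []
    for x in perm:
        # Anything smaller than x must leave the stack before x enters,
        # otherwise it would be popped after x and break the order.
        while stack and stack[-1] < x:
            output.append(stack.pop())
        stack.append(x)
    while stack:
        output.append(stack.pop())
    return output == sorted(perm)

print(stack_sortable([3, 1, 2]))  # True: 312 avoids the pattern 231
print(stack_sortable([2, 3, 1]))  # False: 231 itself cannot be sorted
```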

Relevance: 80.00%

Abstract:

OBJECTIVES AND METHODS: Gender differences regarding 17 childhood experiences thought to have traumatising potential (Traumatic Childhood Experiences, TCE) and pain behaviour in adulthood were assessed using a self-administered, anonymously completed questionnaire. Patients were accrued consecutively in the offices of practicing physicians. Three research questions were formulated: 1) Are specific TCE reported more frequently by male and female patients with the diagnosis "Pain Associated with Psychological Factors" (PP) than by patients with "Pain Explained by Organic Processes" (OP) and "Patients with Diseases without Pain" (OD)? 2) Do PP men and PP women differ in reporting TCE? 3) Are specific TCE correlated with pain duration, pain intensity, and number of operations? RESULTS: 1) TCE occurred more frequently in PP men and PP women than in OP and OD patients. 2) PP women reported considerably more TCE items than PP men. 3) Duration and intensity of adult pain associated with psychological factors correlated with certain TCE items. CONCLUSIONS: All three research questions can be answered with "yes". In patients whose pain has been impossible to diagnose and/or has resisted conventional forms of therapy, TCE (verbal, physical, and sexual abuse) should be looked for, because they often explain adult pain. Unnecessary examinations and surgery can thus be avoided, and therapies can be tailored to the individual patient.

Relevance: 80.00%

Abstract:

BACKGROUND: Theodor Kocher, surgeon and Nobel laureate, has influenced thyroid surgery all over the world: his treatment for multinodular goiter, subtotal thyroidectomy, has been the "gold standard" for more than a century. However, based on a new understanding of molecular growth mechanisms in goitrogenesis, we set out to evaluate whether a more extended resection yields better results. METHODS: Four thousand three hundred and ninety-four thyroid gland operations with 5,785 "nerves at risk" were prospectively analyzed between 1972 and 2002. From 1972 to 1990 the limited Kocher resections were performed; from 1991 to 2002 a more radical resection involving at least a hemithyroidectomy was performed. RESULTS: The incidence of postoperative nerve palsy was 3.6% in the first study period and 0.9% in the second (P < 0.001, Fisher's exact test). Postoperative hypoparathyroidism decreased from 3.2% in the first period to 0.64% in the second (P < 0.01). The rate of reoperation for recurrent disease was 11.1% from 1972 to 1990 and 8.5% from 1991 to 2002 (P < 0.01). CONCLUSIONS: Extended resection for multinodular goiter not only significantly reduced morbidity but also decreased the incidence of operations for recurrent disease. Our findings in a large cohort corroborate the suggestion that Kocher's approach should be replaced by a more radical resection, which was actually his original intention more than 130 years ago.

Relevance: 80.00%

Abstract:

Since product take-back is mandated in Europe and has effects for producers worldwide, including the U.S., designing efficient forward and reverse supply chain networks is becoming essential for business viability. Centralizing production facilities may reduce costs but perhaps not environmental impacts; decentralizing a supply chain may reduce transportation-related environmental impacts but increase capital costs. Facility-location strategies of centralization and decentralization are tested for companies with supply chains that both take back and manufacture products. Decentralized and centralized production systems have different effects on the environment, industry, and the economy: decentralized production systems cluster suppliers within the geographical market region that the system serves, while centralized production systems have many dispersed suppliers that meet all market demand. The point of this research is to further the understanding of company decision-makers about the environmental impacts and costs of choosing a decentralized or centralized supply chain organizational strategy. This research explores what degree of centralization for a supply chain makes the most financial and environmental sense for siting facilities, and which factories are in the best location to handle the financial and environmental impacts of the particular processing steps needed for product manufacture. The research considered two examples of facility location for supply chains with product take-back: a theoretical case involving shoe resoling, and a real-world case study on the location of operations for a company that reclaims multiple products for use as material inputs. For the theoretical example a centralized facility-location strategy was optimal, whereas for the case study a decentralized strategy was best. In conclusion, it is not possible to say that a centralized or decentralized facility-location strategy is in general best for a company that takes back products: each company's specific concerns, needs, and supply chain details will determine which degree of centralization creates the optimal strategy for siting its facilities.
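
The cost side of that trade-off can be caricatured in a few lines; the figures below are purely hypothetical stand-ins, not the thesis's data.

```python
# Toy comparison of the centralization trade-off: fewer plants mean less
# capital cost but longer hauls. All figures are hypothetical.
def total_cost(n_plants, fixed_cost_per_plant, demand_by_region,
               avg_haul_km, cost_per_unit_km):
    transport = sum(d * avg_haul_km * cost_per_unit_km for d in demand_by_region)
    return n_plants * fixed_cost_per_plant + transport

demand = [1000, 800, 1200]  # units per market region (hypothetical)
centralized = total_cost(1, 5_000_000, demand, avg_haul_km=900, cost_per_unit_km=0.5)
decentralized = total_cost(3, 2_000_000, demand, avg_haul_km=150, cost_per_unit_km=0.5)
print(f"centralized:   {centralized:>12,.0f}")
print(f"decentralized: {decentralized:>12,.0f}")
```

Weighting the transport term by an environmental metric instead of a monetary one can flip which configuration wins, which is exactly the case-dependence the thesis reports.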

Relevance: 80.00%

Abstract:

This report deals with a bentonite deposit recently developed approximately seven miles northeast of Warm Springs, Montana. A group of claims has been staked on the deposit; they are owned by the Lincoln Mining Company of Anaconda, Montana. The company has also prospected several claims for silver one mile from its present site of operations, but the silver prospects have failed to produce. The bentonite deposit was discovered incidentally during the course of other development work; at present two adits have been driven into the side of a mountain, each crosscutting a vein-like mass of bentonite varying from two to three feet in width.

Relevance: 80.00%

Abstract:

The article deals with the design of product-service combinations from the standpoint of information integration. The author explains fundamental differences between the traditional and the modern concept of operations management. In addition, the role of logistic support analysis is considered. The article presents the concept of CALS (Continuous Acquisition and Life cycle Support) as an environment enabling data sharing between the business partners involved in the development process.

Relevance: 80.00%

Abstract:

Since 2010, the client base of online-trading service providers has grown significantly. Such companies enable small investors to access the stock market at advantageous rates. Because small investors buy and sell stocks in moderate amounts, they should consider fixed transaction costs, integral transaction units, and dividends when selecting their portfolio. In this paper, we consider the small investor’s problem of investing capital in stocks in a way that maximizes the expected portfolio return and guarantees that the portfolio risk does not exceed a prescribed risk level. Portfolio-optimization models known from the literature are in general designed for institutional investors and do not consider the specific constraints of small investors. We therefore extend four well-known portfolio-optimization models to make them applicable for small investors. We consider one nonlinear model that uses variance as a risk measure and three linear models that use the mean absolute deviation from the portfolio return, the maximum loss, and the conditional value-at-risk as risk measures. We extend all models to consider piecewise-constant transaction costs, integral transaction units, and dividends. In an out-of-sample experiment based on Swiss stock-market data and the cost structure of the online-trading service provider Swissquote, we apply both the basic models and the extended models; the former represent the perspective of an institutional investor, and the latter the perspective of a small investor. The basic models compute portfolios that yield on average a slightly higher return than the portfolios computed with the extended models. However, all generated portfolios yield on average a higher return than the Swiss performance index. There are considerable differences between the four risk measures with respect to the mean realized portfolio return and the standard deviation of the realized portfolio return.
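
As a hedged sketch of what one such extension looks like in practice, the following model maximizes expected return under a mean-absolute-deviation (MAD) risk cap, with integral transaction units and a fixed fee per traded stock, using the open-source PuLP MIP interface. The data, the fee, and the risk cap are illustrative assumptions, not the paper's Swissquote calibration.

```python
# Hedged sketch of an extended MAD model for a small investor: integral
# units, a fixed fee per traded stock, and a mean-absolute-deviation
# risk cap. Data, fee, and cap are illustrative assumptions only.
import numpy as np
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, PULP_CBC_CMD

rng = np.random.default_rng(0)
T, stocks = 60, ["A", "B", "C"]               # scenarios and tickers (hypothetical)
price = {"A": 50.0, "B": 120.0, "C": 80.0}    # price per transaction unit
returns = {s: rng.normal(0.006, 0.04, T) for s in stocks}  # scenario returns
mu = {s: returns[s].mean() for s in stocks}   # expected return per stock
capital, fee, risk_cap = 10_000.0, 9.0, 0.02  # budget, fee per trade, MAD cap

prob = LpProblem("small_investor_MAD", LpMaximize)
x = {s: LpVariable(f"units_{s}", lowBound=0, cat="Integer") for s in stocks}
y = {s: LpVariable(f"trade_{s}", cat="Binary") for s in stocks}
d = [LpVariable(f"dev_{t}", lowBound=0) for t in range(T)]

# Objective: expected portfolio gain net of the fixed transaction fees.
prob += lpSum(mu[s] * price[s] * x[s] for s in stocks) - fee * lpSum(y.values())

# Budget covers purchases plus fees; fees are paid only for traded stocks.
prob += lpSum(price[s] * x[s] for s in stocks) + fee * lpSum(y.values()) <= capital
for s in stocks:
    prob += price[s] * x[s] <= capital * y[s]

# Linearized absolute deviation of the portfolio gain in each scenario.
for t in range(T):
    dev = lpSum((returns[s][t] - mu[s]) * price[s] * x[s] for s in stocks)
    prob += d[t] >= dev
    prob += d[t] >= -dev
prob += lpSum(d) <= risk_cap * capital * T    # mean abs. deviation <= cap

prob.solve(PULP_CBC_CMD(msg=0))
print({s: int(x[s].value()) for s in stocks})
```

The nonlinear variance model would replace the MAD constraint with a quadratic form and a different solver; the small-investor extensions (integrality, fixed fees, dividends) keep the same structure across all four risk measures.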