355 results for Correctness
Abstract:
Most parallel computing applications in high-performance computing use the Message Passing Interface (MPI) API. Given the fundamental importance of parallel computing to science and engineering research, application correctness is paramount. MPI was originally developed around 1993 by the MPI Forum, a group of vendors, parallel programming researchers, and computational scientists. However, the document defining the standard is not issued by an official standards organization but has become a de facto standard. © 2011 ACM.
Abstract:
We propose a dynamic verification approach for large-scale message passing programs to locate correctness bugs caused by unforeseen nondeterministic interactions. This approach hinges on an efficient protocol to track the causality between nondeterministic message receive operations and potentially matching send operations. We show that causality tracking protocols that rely solely on logical clocks fail to capture all nuances of MPI program behavior, including the variety of ways in which nonblocking calls can complete. Our approach rests on formally defining the matches-before relation underlying the MPI standard and on devising lazy-update logical-clock algorithms that can correctly discover all potential outcomes of nondeterministic receives in practice. The resulting protocol, LLCP, can achieve the same coverage as a vector-clock-based algorithm while maintaining good scalability. LLCP allows us to analyze realistic MPI programs involving a thousand MPI processes, incurring only modest overheads in terms of communication bandwidth, latency, and memory consumption. © 2011 IEEE.
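For readers unfamiliar with logical-clock causality tracking, the sketch below shows the classic Lamport-clock piggybacking scheme that such protocols build on. The names and structure are illustrative only, not the paper's LLCP algorithm, which adds lazy updates on top of this baseline.

```cpp
#include <algorithm>
#include <cstdint>

// Baseline Lamport-clock piggybacking for message passing, the kind of
// logical-clock tracking that LLCP-style protocols refine. Every send
// stamps the message with the sender's clock; every receive merges it.
struct LamportClock {
    uint64_t time = 0;

    // Called by the sender: advance the clock and piggyback it on the message.
    uint64_t on_send() { return ++time; }

    // Called by the receiver: merge the piggybacked clock.
    // Note: Lamport clocks impose an order even on concurrent events, so
    // clock comparisons alone cannot precisely decide which sends may match
    // a nondeterministic (wildcard) receive -- the imprecision that vector
    // clocks, and the paper's lazy-update scheme, are designed to address.
    void on_receive(uint64_t piggybacked) {
        time = std::max(time, piggybacked) + 1;
    }
};
```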
Abstract:
The use of efficient synchronization mechanisms is crucial for implementing fine-grained parallel programs on modern shared-cache multi-core architectures. In this paper we study this problem by considering Single-Producer/Single-Consumer (SPSC) coordination using unbounded queues. A novel unbounded SPSC algorithm capable of reducing the raw synchronization latency and speeding up Producer-Consumer coordination is presented. The algorithm has been extensively tested on a shared-cache multi-core platform, and a sketch of its correctness proof is presented. The proposed queues have been used as basic building blocks to implement the FastFlow parallel framework, which has been demonstrated to offer very good performance for fine-grain parallel applications. © 2012 Springer-Verlag.
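As background, a minimal bounded SPSC ring buffer is sketched below in C++; it illustrates the single-producer/single-consumer discipline the abstract refers to. This is a generic textbook structure, not FastFlow's actual unbounded algorithm (which, roughly, removes the capacity bound by chaining buffers of this kind).

```cpp
#include <atomic>
#include <cstddef>

// Generic bounded SPSC ring buffer: one thread calls push(), one thread
// calls pop(); no locks are needed because each index has a single writer.
template <typename T, std::size_t Capacity>
class SpscRing {
    T buf_[Capacity];
    std::atomic<std::size_t> head_{0};  // advanced only by the consumer
    std::atomic<std::size_t> tail_{0};  // advanced only by the producer
public:
    bool push(const T& v) {  // producer thread only
        std::size_t t = tail_.load(std::memory_order_relaxed);
        std::size_t next = (t + 1) % Capacity;
        if (next == head_.load(std::memory_order_acquire))
            return false;    // queue full
        buf_[t] = v;
        tail_.store(next, std::memory_order_release);  // publish the slot
        return true;
    }
    bool pop(T& out) {       // consumer thread only
        std::size_t h = head_.load(std::memory_order_relaxed);
        if (h == tail_.load(std::memory_order_acquire))
            return false;    // queue empty
        out = buf_[h];
        head_.store((h + 1) % Capacity, std::memory_order_release);
        return true;
    }
};
```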
Abstract:
Dynamic Voltage and Frequency Scaling (DVFS) exhibits fundamental limitations as a method to reduce energy consumption in computing systems. In the HPC domain, where performance is of highest priority and codes are heavily optimized to minimize idle time, DVFS has limited opportunity to achieve substantial energy savings. This paper explores whether operating processors Near the transistor Threshold Voltage (NTV) is a better alternative to DVFS for breaking the power wall in HPC. NTV presents challenges, since it compromises both performance and reliability to reduce power consumption. We present a first-of-its-kind study of a significance-driven execution paradigm that selectively uses NTV and algorithmic error tolerance to reduce energy consumption in performance-constrained HPC environments. Using an iterative algorithm as a use case, we present an adaptive execution scheme that switches between near-threshold execution on many cores and above-threshold execution on one core, as the computational significance of iterations in the algorithm evolves over time. Using this scheme on state-of-the-art hardware, we demonstrate energy savings ranging from 35% to 67% while compromising neither correctness nor performance.
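To make the adaptive scheme concrete, here is a minimal C++ sketch: the mode switch `set_mode`, the significance proxy, the cutoff value, and the toy solver are all assumptions introduced for illustration, not the paper's actual implementation or hardware interface.

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical platform hook: on real hardware this would reprogram
// voltage/frequency through a vendor-specific power-management interface.
enum class Mode { AboveThresholdOneCore, NearThresholdManyCores };
void set_mode(Mode) { /* platform-specific; placeholder only */ }

// Toy iterative solver (Newton iteration for sqrt(2)) standing in for the
// paper's use-case algorithm.
double iterate(double x) { return 0.5 * (x + 2.0 / x); }

int main() {
    double x = 1.0, prev = 0.0;
    const double signif_threshold = 1e-3;  // assumed significance cutoff
    for (int i = 0; i < 50 && std::fabs(x - prev) > 1e-12; ++i) {
        // Significance proxy: how much the last iteration changed the result.
        // Early, high-significance iterations run reliably above threshold;
        // later, error-tolerant ones run at NTV on many cores.
        double significance = std::fabs(x - prev);
        set_mode(significance > signif_threshold
                     ? Mode::AboveThresholdOneCore
                     : Mode::NearThresholdManyCores);
        prev = x;
        x = iterate(x);
        std::printf("iteration %d: x = %.12f\n", i, x);
    }
    return 0;
}
```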
Abstract:
Northern Ireland is emerging from violence but still living with conflict. The recent flags protests in Belfast represent a challenge to public administration to transcend the contested politics of local government in Northern Ireland and to navigate a way through a symbolic legacy issue. This article draws on a longitudinal hermeneutic analysis of empirical research conducted on Northern Ireland local government over a decade, where these concerns dominated much debate. Additional analysis of the research findings reveals broader problems applicable to any public administration faced with managing situations in which good governance in public participation and procedural correctness operates alongside fundamental political disagreement and distrust. These conclusions are particularly pertinent for local administrations in societies transitioning from conflict.
Abstract:
A multiuser scheduling multiple-input multiple-output (MIMO) cognitive radio network (CRN) with space-time block coding (STBC) is considered in this paper, where one secondary base station (BS) communicates with one secondary user (SU) selected from K candidates. The joint impact on the outage performance of imperfect channel state information (CSI) in the BS → SU and BS → PU (primary user) links, due to channel estimation errors and feedback delay, is first investigated. We obtain exact outage probability expressions for the considered network under the peak interference power IP at the PU and the maximum transmit power Pm at the BS, covering both perfect and imperfect CSI scenarios in the BS → SU and BS → PU links. In addition, asymptotic expressions for the outage probability in the high-SNR region are derived, from which we obtain several important insights into the system design. For example, multiuser diversity can be exploited only with perfect CSI in the BS → SU links, i.e., without channel estimation errors and feedback delay. Finally, simulation results confirm the correctness of our analysis.
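A rough way to sanity-check outage analyses of this kind is Monte Carlo simulation. The sketch below estimates the outage probability of a heavily simplified version of the setup (single-antenna Rayleigh fading, perfect CSI, unit-mean channel gains, BS power capped by min(Pm, IP/g)); every symbol and parameter value here is an assumption for illustration, not the paper's STBC/MIMO system model.

```cpp
#include <algorithm>
#include <cstdio>
#include <random>

// Toy Monte Carlo for an underlay-CRN outage setup: BS transmit power is
// capped by min(Pm, Ip / g_bp), the best of K SUs is scheduled, and an
// outage occurs when the selected SNR falls below a target.
int main() {
    const int K = 4, trials = 1'000'000;
    const double Pm = 10.0, Ip = 5.0, noise = 1.0, snr_target = 2.0;
    std::mt19937_64 rng(42);
    std::exponential_distribution<double> exp1(1.0);  // unit-mean |h|^2

    int outages = 0;
    for (int t = 0; t < trials; ++t) {
        double g_bp = exp1(rng);                 // BS -> PU channel gain
        double power = std::min(Pm, Ip / g_bp);  // interference-power cap
        double best = 0.0;                       // multiuser selection
        for (int k = 0; k < K; ++k)
            best = std::max(best, exp1(rng));    // BS -> SU_k channel gain
        if (power * best / noise < snr_target) ++outages;
    }
    std::printf("estimated outage probability: %.4f\n",
                double(outages) / trials);
    return 0;
}
```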
Abstract:
Approximately 16 million strokes occur worldwide every year. About half of the survivors will present a motor deficit requiring rehabilitation in the window from 3 to 6 months after the stroke. In developed countries, stroke-related costs are estimated at about 0.27% of each country's Gross Domestic Product. This situation imposes an enormous social and financial burden. Paradoxically, given this situation, the need for more intensive, patient-centred motor rehabilitation services is accepted in the medical community. The review of the state of the art demonstrates the archetype relating more intensive therapeutic methodologies to more proficient motor rehabilitation of the patient. It also reveals the shortcomings of existing technological solutions, which exhibit high complexity and high acquisition and maintenance costs. Accordingly, the question underpinning this doctoral work asks whether it is possible to create a new, easy-to-use, low-cost device capable of supporting more efficient motor recovery of a patient after stroke, combining intensity with determination of the correctness of the movements performed relative to those prescribed. Proposing the use of vibratory stimulation as a proprioceptive tool for therapeutic intervention in the new device, tolerability to this type of stimulus is demonstrated by testing a first version of the system, comprising only the stimulation component, on a first group of 5 patients. This phase would go on to validate the subsequent development of the SWORD system. With the SWORD system designed as a complementary tool integrating motor assessment and proprioceptive intervention through stimulation, the development of its movement quantification component is described. The various solutions studied are presented, together with the algorithm of the final implementation, based on sensor fusion of measurements from three sensors: accelerometer, gyroscope, and magnetometer. Testing of the SWORD system, compared with the traditional rehabilitation method, showed a considerable gain in intensity and quality of motor execution for 4 of the 5 patients tested in a second experimental group. The versatility of the SWORD system is shown through the development of the Tele-Rehabilitation module, which complements the movement quantification component with a graphical feedback interface and a tool for remote analysis of the patient's motor evolution. Finally, building on the movement quantification component, a version for automated motor assessment was also developed, implemented from the WMFT scale, which aims to remove the subjective factor of human evaluation present in the motor assessment scales used in Neurology. This version of the system was tested on a third experimental group of five patients.
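The abstract does not reproduce the fusion algorithm itself; as a generic illustration of accelerometer/gyroscope/magnetometer fusion, a minimal complementary filter is sketched below. All names and the constant `alpha` are assumptions; the SWORD system's actual algorithm is developed in the thesis.

```cpp
// Minimal one-axis complementary filter fusing a gyroscope rate with an
// absolute angle derived from the accelerometer and magnetometer. This is
// a common, simple approach to such sensor fusion, shown only as a sketch;
// it is not the SWORD system's implementation.
struct ComplementaryFilter {
    double angle = 0.0;   // estimated orientation (radians)
    double alpha = 0.98;  // weight on gyro integration vs. reference angle

    // gyro_rate: angular rate (rad/s); ref_angle: absolute angle from the
    // accelerometer/magnetometer (rad); dt: sample period (s).
    double update(double gyro_rate, double ref_angle, double dt) {
        angle = alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * ref_angle;
        return angle;
    }
};
```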
Abstract:
This paper continues a systematic approach to building natural deduction calculi and corresponding proof procedures for non-classical logics. Our attention now turns to the framework of paraconsistent logics. These logics are used, in particular, for reasoning about systems where paradoxes do not lead to the 'deductive explosion', i.e., where formulae of the type 'A follows from false', for any A, are not valid. We formulate the natural deduction system for the logic PCont, explain its main concepts, define a proof-searching technique, and illustrate it with examples. The presentation is accompanied by a demonstration of the correctness of these developments.
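To see what is at stake, recall the classical rule of explosion (ex falso quodlibet), shown below, which paraconsistent systems such as PCont reject; the exact rule formulations of PCont are given in the paper itself.

```latex
% Ex falso quodlibet: derivable in classical natural deduction, rejected in
% paraconsistent logics such as PCont, so a local contradiction does not
% entail every formula B.
\[
\frac{A \qquad \neg A}{B}
\quad\text{(EFQ: valid classically, not valid in PCont)}
\]
```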
Abstract:
Existing Workflow Management Systems (WFMSs) follow a pragmatic approach. They often use a proprietary modelling language with an intuitive graphical layout. However, the underlying semantics lacks a formal foundation. As a consequence, analysis issues such as proving correctness (i.e., soundness and completeness) and reliable execution are not supported at design level. This project will use an applied-ontology approach, formally defining key terms such as process, sub-process, and action/task on the basis of formal temporal theory. Current business process modelling (BPM) standards such as Business Process Modelling Notation (BPMN) and the Unified Modelling Language (UML) Activity Diagram (AD) model their constructs with no logical basis. This investigation will contribute to research and industry by providing a framework that grounds BPM so that a correct business process (BP) can be represented and reasoned about. This is missing in the current BPM domain; the framework may reduce design costs and avert the burden of redundant terms used by the current standards. A graphical tool will be introduced that implements the formal ontology defined in the framework. This new tool can be used as a modelling tool and, at the same time, serves to validate the model. This research will also fill the existing gap by providing a unified graphical representation of a BP in a logically consistent manner for the mainstream modelling standards in the fields of business and IT. A case study will be conducted to analyse a catalogue of existing 'patient pathways', i.e. processes, of King's College Hospital NHS Trust, including current performance statistics. Following the application of the framework, a mapping will be conducted and new performance statistics will be collected. A cost/benefit analysis report will be produced comparing the results of the two approaches.
Abstract:
This paper presents a contrastive approach to three different ways of building concepts, after establishing the similar syntactic possibilities that coexist in terms. From the semantic point of view, however, each language family distributes meaning differently. The most important point we try to show is that the differences found in the psychological process of communicating concepts should guide the translator and the terminologist in target-text production and in the terminology planning process. Differences between languages in the information transmission process are due to the different roles played by different types of knowledge. We distinguish here analytic-descriptive knowledge and analogical knowledge, among others. We also state that none of them is inherently best for determining the correctness of a term; rather, adequacy criteria must inform the selection process. This success in concept building or term building is important when looking at the linguistic map of the information society.
Abstract:
Impurity-free emission spectra of HCCCHO and DCCCHO have been rephotographed using the electronic-energy-exchange method with benzene as a carrier gas. The near-ultraviolet spectra of HCCCHO and DCCCHO were photographed in absorption under conditions of high resolution, with absorption path lengths up to 100 meters. The emission and absorption spectra of Propynal resulting from the ³A″ ← ¹A′ excitation have been reanalyzed in some detail. Both of the CH out-of-plane wagging modes were found to have negative anharmonicity. Barrier heights of 56.8/0.0 cm⁻¹ and nonplanar equilibrium angles of 17.3°/3.0° are calculated for the ν10/ν11 modes. The in-plane and out-of-plane vibrational modes in the ³A″ and ¹A′ electronic states of Propynal were subjected to a normal-coordinate treatment in the approximation of the Urey-Bradley force field. From the relative oscillator strengths of the transitions connecting the vibrationless ¹A′ state and the vibronic levels of the ³A″ state, the differences in equilibrium configuration were evaluated from an approximate Franck-Condon analysis based on the ground-state normal coordinates. As this treatment gave 512 possible geometrical structures for the upper state, it was necessary to resort to a comparison of the observed and calculated moments of inertia, along with chemical intuition, to isolate the structure. A test of the correctness of the calculated structure change and the vibrational assignment was made by evaluating the intensities of the in-plane and out-of-plane fundamental, sequence, and cross-sequence transitions by the exact Franck-Condon method.
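For context, the exact Franck-Condon method mentioned here rests on the standard relation below: vibronic intensity is governed by the squared overlap of the vibrational wavefunctions of the two electronic states (stated in its textbook form; the thesis applies the multidimensional analogue).

```latex
% Franck-Condon principle: the intensity of a vibronic transition is
% proportional to the squared overlap integral of the initial and final
% vibrational wavefunctions.
\[
I_{v' \leftarrow v''} \;\propto\; \bigl|\langle \chi_{v'} \mid \chi_{v''} \rangle\bigr|^{2}
\]
```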
Abstract:
Dynamic logic is an extension of modal logic originally intended for reasoning about computer programs. The method of proving correctness properties of a computer program using the well-known Hoare Logic can be implemented by utilizing the robustness of dynamic logic. For a very broad range of languages and applications in program verification, a theorem prover named KIV (Karlsruhe Interactive Verifier) has already been developed. But its high degree of automation and its complexity make it difficult to use for educational purposes. My research work is directed towards the design and implementation of a similar interactive theorem prover with educational use as its main design criterion. As the key purpose of this system is to serve as an educational tool, it is a self-explanatory system that explains every step of creating a derivation, i.e., proving a theorem. This deductive system is implemented in the platform-independent programming language Java. In addition, a very popular combination of a lexical analyzer generator, JFlex, and the parser generator BYacc/J has been used for parsing formulas and programs.
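The connection the abstract alludes to is standard: a Hoare partial-correctness triple embeds into dynamic logic via the box modality, as shown below.

```latex
% Hoare triple {P} S {Q} as a dynamic logic formula: [S]Q holds when Q is
% true after every terminating execution of program S.
\[
\{P\}\; S\; \{Q\} \quad\Longleftrightarrow\quad P \rightarrow [S]\,Q
\]
```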
Abstract:
The meeting notes include an audit for "year ending June 30th, 1880, thence to May, 25th, 1881". There is also an "auditor's report" by Richard Tew and P. Corridi which states: "To the President and shareholders of the Ontario Grape Growing and Wine Manufacturing Company, Barnsdale, Ont. Gentlemen, We the undersigned have much pleasure in informing you that we have completed our audit of the Company's Books for the year ending June 30th, 1880, thence to May 24th, 1881, and can testify to their correctness. In future we would recommend that the Books be made up to May 24th in each year, five weeks previous to the Annual meeting being held, and that the Day Book kept by the manager be submitted to the Secretary monthly, together with all vouchers, so that the transactions can be duly recorded in the Company's Books. We beg to congratulate the shareholders on the satisfactory exhibit of the Company's affairs, as shown by the annexed Balance Sheet. We are Gentlemen, yours respectfully." This is the first audit included in the book.