946 results for Formal Verification Methods


Relevance: 30.00%

Abstract:

Formal tools like finite-state model checkers have proven useful in verifying the correctness of systems of bounded size and for hardening single system components against arbitrary inputs. However, conventional applications of these techniques are not well suited to characterizing emergent behaviors of large compositions of processes. In this paper, we present a methodology by which arbitrarily large compositions of components can, if sufficient conditions are proven concerning properties of small compositions, be modeled and completely verified by performing formal verifications upon only a finite set of compositions. The sufficient conditions take the form of reductions, which are claims that particular sequences of components will be causally indistinguishable from other shorter sequences of components. We show how this methodology can be applied to a variety of network protocol applications, including two features of the HTTP protocol, a simple active networking applet, and a proposed web cache consistency algorithm. We also discuss its applicability to framing protocol design goals and to representing systems which employ non-model-checking verification methodologies. Finally, we briefly discuss how we hope to broaden this methodology to more general topological compositions of network applications.
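A toy illustration of the reduction idea (not the paper's tool chain): if a chain of k+1 identical components is indistinguishable, as seen end to end, from a chain of k components, then all longer chains collapse to the k-component case, and model checking a finite set of short chains covers every length. The component relation, symbols, and cutoff search below are invented for the example.

    # Sketch: a component is modeled as a relation from input symbols to sets of
    # output symbols; chaining components is relational composition.  A "reduction"
    # holds at length k when chains of length k and k+1 behave identically.

    def compose(rel_a, rel_b):
        """Relational composition: feed every output of rel_a into rel_b."""
        return {i: frozenset(o for m in rel_a[i] for o in rel_b[m]) for i in rel_a}

    def chain(component, length):
        result = component
        for _ in range(length - 1):
            result = compose(result, component)
        return result

    # Hypothetical lossy relay: it may pass a message through or degrade it to 'err'.
    relay = {'msg': frozenset({'msg', 'err'}), 'err': frozenset({'err'})}

    # Find the smallest cutoff k with chain(k) == chain(k+1); beyond it, longer
    # chains add no new observable behavior, so verifying lengths 1..k suffices.
    k = 1
    while chain(relay, k) != chain(relay, k + 1):
        k += 1
    print("reduction holds at length", k)   # -> 1 for this toy relay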

Relevance: 30.00%

Abstract:

OBJECTIVE: To compare the performance of formal prognostic instruments vs subjective clinical judgment with regard to predicting functional outcome in patients with spontaneous intracerebral hemorrhage (ICH). METHODS: This prospective observational study enrolled 121 ICH patients hospitalized at 5 US tertiary care centers. Within 24 hours of each patient's admission to the hospital, one physician and one nurse on each patient's clinical team were each asked to predict the patient's modified Rankin Scale (mRS) score at 3 months and to indicate whether he or she would recommend comfort measures. The admission ICH score and FUNC score, 2 prognostic scales selected for their common use in neurologic practice, were calculated for each patient. Spearman rank correlation coefficients (r) with respect to patients' actual 3-month mRS for the physician and nursing predictions were compared against the same correlation coefficients for the ICH score and FUNC score. RESULTS: The absolute value of the correlation coefficient for physician predictions with respect to actual outcome (0.75) was higher than that of either the ICH score (0.62, p = 0.057) or the FUNC score (0.56, p = 0.01). The nursing predictions of outcome (r = 0.72) also trended towards an accuracy advantage over the ICH score (p = 0.09) and FUNC score (p = 0.03). In an analysis that excluded patients for whom comfort care was recommended, the 65 available attending physician predictions retained greater accuracy (r = 0.73) than either the ICH score (r = 0.50, p = 0.02) or the FUNC score (r = 0.42, p = 0.004). CONCLUSIONS: Early subjective clinical judgment of physicians correlates more closely with 3-month outcome after ICH than prognostic scales.
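The headline comparison is between Spearman rank correlations of each predictor against the observed 3-month mRS. A minimal sketch of that calculation on fabricated numbers (the study's test for comparing dependent correlations is not reproduced):

    # Sketch: Spearman correlation of two predictors against actual 3-month mRS.
    # All values below are fabricated for illustration only.
    from scipy.stats import spearmanr

    actual_mrs     = [0, 2, 3, 4, 5, 6, 1, 4, 2, 5]   # observed outcomes
    physician_pred = [1, 2, 3, 4, 5, 6, 1, 3, 2, 5]   # bedside predictions
    ich_score      = [0, 1, 2, 3, 3, 4, 1, 2, 1, 3]   # formal prognostic scale

    r_phys, _ = spearmanr(physician_pred, actual_mrs)
    r_ich, _  = spearmanr(ich_score, actual_mrs)
    print(f"|r| physician = {abs(r_phys):.2f}, |r| ICH score = {abs(r_ich):.2f}")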

Relevance: 30.00%

Abstract:

The phrase “not much mathematics required” can imply a variety of skill levels. When this phrase is applied to computer scientists, software engineers, and clients in the area of formal specification, the word “much” can be widely misinterpreted with disastrous consequences. A small experiment in reading specifications revealed that students already trained in discrete mathematics and the specification notation performed very poorly, much worse than could reasonably be expected if formal methods proponents are to be believed.

Relevance: 30.00%

Abstract:

PURPOSE: MicroRNAs (miRNAs) play a global role in regulating gene expression and have important tissue-specific functions. Little is known about their role in the retina. The purpose of this study was to establish the retinal expression of those miRNAs predicted to target genes involved in vision. METHODS: miRNAs potentially targeting important "retinal" genes, as defined by expression pattern and implication in disease, were predicted using a published algorithm (TargetScan; Envisioneering Medical Technologies, St. Louis, MO). The presence of candidate miRNAs in human and rat retinal RNA was assessed by RT-PCR. cDNA levels for each miRNA were determined by quantitative PCR. The ability to discriminate between miRNAs varying by a single nucleotide was assessed. The activity of miR-124 and miR-29 against predicted target sites in Rdh10 and Impdh1 was tested by cotransfection of miRNA mimics and luciferase reporter plasmids. RESULTS: Sixty-seven miRNAs were predicted to target one or more of the 320 retinal genes listed herein. All 11 candidate miRNAs tested were expressed in the retina, including miR-7, miR-124, miR-135a, and miR-135b. Relative levels of individual miRNAs were similar between rats and humans. The Rdh10 3'UTR, which contains a predicted miR-124 target site, mediated the inhibition of luciferase activity by miR-124 mimics in cell culture. CONCLUSIONS: Many miRNAs likely to regulate genes important for retinal function are present in the retina. Conservation of miRNA retinal expression patterns from rats to humans supports evidence from other tissues that disruption of miRNAs is a likely cause of a range of visual abnormalities.

Relevance: 30.00%

Abstract:

Functional and non-functional concerns require different programming effort, different techniques and different methodologies when attempting to program efficient parallel/distributed applications. In this work we present a "programmer-oriented" methodology based on formal tools that permits reasoning about parallel/distributed program development and refinement. The proposed methodology is semi-formal in that it does not require the exploitation of highly formal tools and techniques, while providing palatable and effective support to programmers developing parallel/distributed applications, in particular when handling non-functional concerns.

Relevance: 30.00%

Abstract:

Objective: The aim of this study was to investigate the effect of pre-treatment verification imaging with megavoltage (MV) X-rays on cancer and normal cell survival in vitro and to compare the findings with theoretically modelled data. Since the dose received from pre-treatment imaging can be significant, incorporation of this dose at the planning stage of treatment has been suggested.

Methods: The impact of imaging dose incorporation on cell survival was investigated by clonogenic assay, irradiating DU-145 prostate cancer, H460 non-small cell lung cancer and AGO-1522b normal tissue fibroblast cells. Clinically relevant imaging-to-treatment times of 7.5 minutes and 15 minutes were chosen for this study. The theoretical magnitude of the loss of radiobiological efficacy due to sublethal damage repair was investigated using the Lea-Catcheside dose protraction factor model.

Results: For the cell lines investigated, the experimental data showed that imaging dose incorporation had no significant impact upon cell survival. These findings were in close agreement with the theoretical results.

Conclusions: For the conditions investigated, the results suggest that allowance for the imaging dose at the planning stage of treatment should not adversely affect treatment efficacy.

Advances in Knowledge: There is a paucity of data in the literature on imaging effects in radiotherapy. This paper presents a systematic study of imaging dose effects on cancer and normal cell survival, providing both theoretical and experimental evidence for clinically relevant imaging doses and imaging-to-treatment times. The data provide a firm foundation for further study into this highly relevant area of research.
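As a worked illustration of the Lea-Catcheside dose protraction factor mentioned in the Methods: for two acute exposures separated by a repair interval, the factor has a simple closed form and enters the linear-quadratic survival model as S = exp(-αD - βG·D²). The sketch below uses that textbook form; the radiobiological parameters and doses are illustrative assumptions, not the study's fitted values.

    # Sketch: LQ survival with the Lea-Catcheside protraction factor G for two
    # acute doses d1 (imaging) and d2 (treatment) separated by a gap in minutes,
    # assuming mono-exponential sublethal-damage repair.  Parameter values are
    # illustrative assumptions, not the paper's data.
    import math

    def lea_catcheside_two_fractions(d1, d2, gap_min, repair_halftime_min):
        mu = math.log(2) / repair_halftime_min          # repair rate constant
        return (d1**2 + d2**2 + 2 * d1 * d2 * math.exp(-mu * gap_min)) / (d1 + d2)**2

    def surviving_fraction(d1, d2, gap_min, alpha=0.2, beta=0.05, halftime=15.0):
        total = d1 + d2
        g = lea_catcheside_two_fractions(d1, d2, gap_min, halftime)
        return math.exp(-alpha * total - beta * g * total**2)

    for gap in (7.5, 15.0):                             # imaging-to-treatment times
        print(gap, surviving_fraction(d1=0.05, d2=2.0, gap_min=gap))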

Relevance: 30.00%

Abstract:

Background

In modern radiotherapy, it is crucial to monitor the performance of all linac components including gantry, collimation system and electronic portal imaging device (EPID) during arc deliveries. In this study, a simple EPID-based measurement method has been introduced in conjunction with an algorithm to investigate the stability of these systems during arc treatments with the aim of ensuring the accuracy of linac mechanical performance.


Methods

The Varian EPID sag, gantry sag, changes in source-to-detector distance (SDD), EPID and collimator skewness, EPID tilt, and the sag in MLC carriages as a result of linac rotation were separately investigated by acquisition of EPID images of a simple phantom comprised of 5 ball-bearings during arc delivery. A fast and robust software package was developed for automated analysis of image data. Twelve Varian linacs of different models were investigated.


Results

The average EPID sag was within 1 mm for all tested linacs. All machines showed less than 1 mm gantry sag. Changes in SDD values were within 1.7 mm except for three linacs of one centre which were within 9 mm. Values of EPID skewness and tilt were negligible in all tested linacs. The maximum sag in MLC leaf bank assemblies was around 1 mm. The EPID sag showed a considerable improvement in TrueBeam linacs.


Conclusion

The methodology and software developed in this study provide a simple tool for effective investigation of the behaviour of linac components with gantry rotation. It is reproducible and accurate and can be easily performed as a routine test in clinics.
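The published analysis software itself is not described here; as a hedged sketch of one plausible approach to the ball-bearing image analysis, each bearing can be located in an EPID frame by thresholding and centroiding, and sag estimated from the drift of the detected positions with gantry angle. The threshold, pixel pitch, and data layout below are assumptions, not the paper's implementation.

    # Sketch: centroids of bright blobs in an EPID frame (a 2-D numpy array), and
    # sag estimated as the peak-to-peak drift of one bearing's centroid across
    # gantry angles.  Threshold and geometry values are illustrative assumptions.
    import numpy as np
    from scipy import ndimage

    def bearing_centroids(image, threshold):
        mask = image > threshold                      # bearings appear as bright spots
        labels, n = ndimage.label(mask)
        return ndimage.center_of_mass(image, labels, list(range(1, n + 1)))

    def sag_mm(centroids_by_angle, pixel_pitch_mm):
        """Peak-to-peak vertical drift of the first bearing over all gantry angles."""
        rows = np.array([c[0][0] for c in centroids_by_angle])
        return (rows.max() - rows.min()) * pixel_pitch_mm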

Relevance: 30.00%

Abstract:

Doctoral thesis, Mathematics, Operational Research, Universidade do Algarve, 2009

Relevance: 30.00%

Abstract:

In real optimization problems, the analytical expression of the objective function and its derivatives are usually unknown or too complex to work with. In these cases it becomes essential to use optimization methods that do not require computing derivatives, or even verifying their existence: Direct Search (derivative-free) Methods are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for doing so are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem. In this problem a trial step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
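A minimal sketch of the filter acceptance rule such a method is built around; the simplex construction, reflection and shrink steps, and the paper's specific algorithm are omitted, and the violation measure and dominance test below follow the generic filter-method literature rather than necessarily this work's variant.

    # Sketch: generic filter acceptance for constrained, derivative-free search.
    # A trial point is kept if no stored pair (f_i, h_i) dominates it, i.e. if it
    # improves either the objective f or the aggregated constraint violation h.
    def violation(constraints, x):
        """h(x): sum of positive parts of inequality constraints g_i(x) <= 0."""
        return sum(max(0.0, g(x)) for g in constraints)

    def acceptable(filter_pairs, f_new, h_new):
        return all(f_new < f_i or h_new < h_i for f_i, h_i in filter_pairs)

    def add_to_filter(filter_pairs, f_new, h_new):
        # Drop entries dominated by the newcomer, then insert it.
        kept = [(f, h) for f, h in filter_pairs if f < f_new or h < h_new]
        kept.append((f_new, h_new))
        return kept

    # Toy problem: minimise f(x) = x0^2 + x1^2 subject to x0 + x1 >= 1.
    f = lambda x: x[0]**2 + x[1]**2
    g = [lambda x: 1.0 - x[0] - x[1]]            # g(x) <= 0 form
    flt = [(f((2.0, 2.0)), violation(g, (2.0, 2.0)))]
    trial = (0.6, 0.6)
    if acceptable(flt, f(trial), violation(g, trial)):
        flt = add_to_filter(flt, f(trial), violation(g, trial))

In a simplex-based variant, each reflected or contracted vertex would be screened by this acceptance test before replacing the worst vertex, which is how the filter replaces penalty parameters.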

Relevance: 30.00%

Abstract:

This study sought to explore ways to work with a group of young people through an arts-based approach to the teaching of literacy. Through the research, the author integrated her own reflexivity from applying arts methods over the past decade. The author’s past experiences were strongly informed by theories such as caring theory and maternal pedagogy, which also informed the research design. The study incorporated qualitative data collection instruments comprising interviews, journals, sketches, artifacts, and teacher field notes. Data were collected from 3 student participants for the duration of the research. Study results provide educators with data on the impact of creating informal and alternative ways to teach literacy and maintaining student engagement with resistant learners.

Relevance: 30.00%

Abstract:

The use of formal methods is increasingly common in software development, and type systems are the most successful formal method. The advancement of formal methods brings new challenges as well as new opportunities. One of the challenges is to ensure that a compiler preserves the semantics of programs, so that the properties guaranteed about the source code also hold for the executable code. This thesis presents a compiler that translates a higher-order functional language with polymorphism into a typed assembly language, whose main property is that type preservation is verified in an automated way, by means of type annotations on the compiler's code. Our compiler implements the code transformations essential for a higher-order functional language, namely CPS conversion, closure conversion, and code generation. We present the details of the strongly typed representations of the intermediate languages and the constraints they impose on the implementation of the code transformations. Our goal is to guarantee type preservation with a minimum of annotations, without compromising the overall modularity and readability of the compiler's code. This goal is largely achieved for the core features of the language ("simple types"), in contrast to the treatment of polymorphism, which still requires substantial work to satisfy the type checker.
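As a rough illustration of one of the transformations mentioned, here is a tiny untyped CPS conversion for a three-constructor lambda language; it is not the thesis' typed, annotation-checked implementation, and the term representation is invented for the example.

    # Sketch: naive call-by-value CPS conversion for a toy lambda calculus.
    # This is an untyped illustration; the thesis' compiler works on typed IRs.
    from dataclasses import dataclass
    from itertools import count

    @dataclass
    class Var: name: str
    @dataclass
    class Lam: params: tuple; body: object
    @dataclass
    class App: fn: object; args: tuple

    _ids = count()
    def fresh(prefix):
        return f"{prefix}{next(_ids)}"

    def cps(expr, k):
        """Rewrite `expr` so that its value is passed to the continuation term `k`."""
        if isinstance(expr, Var):
            return App(k, (expr,))
        if isinstance(expr, Lam):
            (x,) = expr.params
            c = fresh("k")                       # functions gain a continuation argument
            return App(k, (Lam((x, c), cps(expr.body, Var(c))),))
        if isinstance(expr, App):
            f, a = fresh("f"), fresh("a")
            return cps(expr.fn,
                       Lam((f,), cps(expr.args[0],
                                     Lam((a,), App(Var(f), (Var(a), k))))))
        raise TypeError(expr)

    # Example: ((\x. x) y) with top-level continuation `halt`.
    term = App(Lam(("x",), Var("x")), (Var("y"),))
    print(cps(term, Var("halt")))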

Relevance: 30.00%

Abstract:

Hardware/software systems are becoming indispensable in every aspect of daily life. The growing presence of these systems in various products and services motivates the search for methods to develop them efficiently. However, efficient design of these systems is limited by several factors, among them: the growing complexity of applications, increasing integration density, the heterogeneous nature of products and services, and shrinking time-to-market. Transaction-level modeling (TLM) is considered a promising paradigm for managing design complexity and for providing means to explore and validate design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on an analysis of timing constraints. We propose to use a combination of two development paradigms to accelerate design: TLM on one hand, and a methodology for expressing time between different transactions on the other. This synergy lets us combine high-performance simulation methods and formal analytical methods in a single environment. We proposed a new timing verification algorithm based on a procedure for linearizing min/max constraints, together with an optimization technique to improve the algorithm's efficiency. We completed the mathematical description of all the constraint types presented in the literature. We developed methods for exploring and refining the communication system that allowed us to apply the timing verification algorithms at different TLM levels. Since several definitions of TLM exist, within our research we defined a specification and simulation methodology for hardware/software systems based on the TLM paradigm, in which several modeling concepts can be considered separately. Based on modern software engineering technologies such as XML, XSLT, XSD, object-oriented programming, and several others provided by the .Net environment, the proposed methodology presents an approach that makes it possible to reuse intermediate models in order to cope with the time-to-market constraint. It provides a general approach to system modeling that separates different design aspects, such as the models of computation used to describe the system at multiple levels of abstraction. As a result, the system's functionality can be clearly identified in the model without the details tied to particular development platforms, which improves the "portability" of the application model.
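A hedged toy of the kind of timing check involved, not the thesis' min/max linearization algorithm: when each event's time is defined as a max or min over predecessor times plus fixed delays, event times can be computed by iteration and then checked against required separations. The event names, delays, and bound below are invented.

    # Sketch: compute event times defined by max/min over predecessors with fixed
    # delays, then check a separation requirement between two events.  This is a
    # toy with constant delays, not the thesis' min/max linearization procedure.
    def event_times(spec, sources):
        times = dict(sources)                       # e.g. {"req": 0.0}
        changed = True
        while changed:
            changed = False
            for ev, (mode, preds) in spec.items():
                if all(p in times for p, _ in preds):
                    cands = [times[p] + d for p, d in preds]
                    t = max(cands) if mode == "max" else min(cands)
                    if times.get(ev) != t:
                        times[ev] = t
                        changed = True
        return times

    spec = {
        "grant": ("max", [("req", 2.0)]),           # grant fires 2 time units after req
        "data":  ("max", [("grant", 1.0), ("req", 5.0)]),
        "ack":   ("min", [("data", 0.5), ("grant", 4.0)]),
    }
    t = event_times(spec, {"req": 0.0})
    assert 0.0 <= t["ack"] - t["req"] <= 10.0       # a required separation bound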

Relevance: 30.00%

Abstract:

Presently different audio watermarking methods are available; most of them are inclined towards copyright protection and copy protection. This is the key motive for the notion to develop a speaker verification scheme that guarantees non-repudiation services, and the thesis is its outcome. The research presented in this thesis scrutinizes the field of audio watermarking, and the outcome is a speaker verification scheme that is proficient in addressing issues allied to non-repudiation to a great extent. This work aimed at developing novel audio watermarking schemes utilizing the fundamental ideas of the Fast Fourier Transform (FFT) or the Fast Walsh-Hadamard Transform (FWHT). The Mel-Frequency Cepstral Coefficients (MFCC), the best parametric representation of acoustic signals, along with a few other key acoustic characteristics, are employed in crafting the new schemes. The audio watermark created is entirely dependent on the acoustic features; hence it is named FeatureMark and is crucial in this work. In any watermarking scheme, the quality of the extracted watermark depends exclusively on the pre-processing stage, and in this work framing and windowing techniques are involved. The theme of non-repudiation is of immense significance in the audio watermarking schemes proposed in this work. Modification of the signal spectrum is achieved in a variety of ways by selecting appropriate FFT/FWHT coefficients, and the watermarking schemes were evaluated for imperceptibility, robustness and capacity characteristics. The proposed schemes are unequivocally effective in terms of maintaining the sound quality, retrieving the embedded FeatureMark, and the capacity to hold the mark bits. The robust nature of these marking schemes is achieved with the help of synchronization codes such as the Barker code with the FFT-based FeatureMarking scheme and the Walsh code with the FWHT-based FeatureMarking scheme. Another important feature associated with this scheme is the employment of an encryption step in the preparation of the FeatureMark, which scrambles the signal features and helps keep them unrevealed. A comparative study with existing watermarking schemes, together with experiments evaluating imperceptibility, robustness and capacity, guarantees that the proposed schemes can serve as baselines for efficient audio watermarking. The four new digital audio watermarking algorithms are remarkable in terms of their performance, thereby opening more opportunities for further research.
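A small, hedged sketch of the general family of techniques the thesis builds on, rather than its FeatureMark scheme: embed one bit per frame by enforcing an order relation between the magnitudes of two chosen FFT bins, and read the bit back by comparing them. The frame size, bin indices, and swap rule are illustrative choices.

    # Sketch: frame-wise FFT watermark that encodes one bit per frame by making
    # |X[k1]| >= |X[k2]| for bit 1 and |X[k1]| < |X[k2]| for bit 0 (the two bins
    # are swapped when needed).  Illustrative only; not the FeatureMark scheme.
    import numpy as np

    FRAME, K1, K2 = 1024, 40, 41          # assumed frame length and bin indices

    def embed(signal, bits):
        out = signal.astype(float)        # assumes len(signal) >= len(bits) * FRAME
        for i, bit in enumerate(bits):
            frame = out[i * FRAME:(i + 1) * FRAME]
            spec = np.fft.rfft(frame)
            hi_first = abs(spec[K1]) >= abs(spec[K2])
            if hi_first != bool(bit):     # swap the two coefficients to flip the relation
                spec[K1], spec[K2] = spec[K2], spec[K1]
            out[i * FRAME:(i + 1) * FRAME] = np.fft.irfft(spec, n=FRAME)
        return out

    def extract(signal, n_bits):
        bits = []
        for i in range(n_bits):
            spec = np.fft.rfft(signal[i * FRAME:(i + 1) * FRAME])
            bits.append(int(abs(spec[K1]) >= abs(spec[K2])))
        return bits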

Relevance: 30.00%

Abstract:

Ontologies have been established for knowledge sharing and are widely used as a means for conceptually structuring domains of interest. With the growing usage of ontologies, the problem of overlapping knowledge in a common domain becomes critical. In this short paper, we address two methods for merging ontologies based on Formal Concept Analysis: FCA-Merge and ONTEX. --- FCA-Merge is a method for merging ontologies following a bottom-up approach which offers a structural description of the merging process. The method is guided by application-specific instances of the given source ontologies. We apply techniques from natural language processing and formal concept analysis to derive a lattice of concepts as a structural result of FCA-Merge. The generated result is then explored and transformed into the merged ontology with human interaction. --- ONTEX is a method for systematically structuring the top-down level of ontologies. It is based on an interactive, top-down knowledge acquisition process, which ensures that the knowledge engineer considers all possible cases while avoiding redundant acquisition. The method is suited especially for creating/merging the top part(s) of the ontologies, where high accuracy is required, and for supporting the merging of two (or more) ontologies on that level.
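A brief sketch of the Formal Concept Analysis machinery both methods rest on: given a binary context of objects and attributes, a formal concept is a pair (extent, intent) closed under the two derivation operators. The context below is invented; in FCA-Merge the context comes from application-specific instances of the source ontologies.

    # Sketch: enumerate formal concepts of a small object/attribute context by
    # brute force over object subsets (fine for toy contexts only).
    from itertools import combinations

    context = {                                   # object -> set of attributes (invented)
        "jaguar":  {"animal", "fast"},
        "cheetah": {"animal", "fast"},
        "oak":     {"plant"},
    }
    attributes = set().union(*context.values())

    def intent(objs):                             # attributes shared by all objects
        return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

    def extent(attrs):                            # objects having all the attributes
        return {o for o, a in context.items() if attrs <= a}

    concepts = set()
    objs = list(context)
    for r in range(len(objs) + 1):
        for subset in combinations(objs, r):
            b = intent(set(subset))
            a = extent(b)                         # close the pair: (A, B) with A' = B, B' = A
            concepts.add((frozenset(a), frozenset(b)))
    for a, b in sorted(concepts, key=lambda c: len(c[0])):
        print(set(a), set(b))

Ordering these concepts by inclusion of extents yields the concept lattice that FCA-Merge explores interactively.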

Relevance: 30.00%

Abstract:

The decomposition methods available when data are fully observed are not valid when the variable of interest is censored. This may explain the scarcity of such exercises for duration variables, which are usually observed under censoring. This paper proposes an Oaxaca-Blinder-type method for decomposing differences in means in the context of censored data. The validity of the method rests on identifying and estimating the joint distribution of the duration variable and a set of covariates. In addition, a more general method is proposed that allows other functionals of interest, such as the median or the Gini coefficient, to be decomposed; it is based on specifying the conditional distribution function of the duration variable given a set of covariates. Monte Carlo experiments are carried out to evaluate the performance of these methods. Finally, the proposed methods are applied to analyze gender gaps in several characteristics of unemployment duration in Spain, such as mean duration, the probability of being long-term unemployed, and the Gini coefficient. The results indicate that factors other than observable characteristics, such as human capital or household structure, play a primary role in explaining these gaps.
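For reference, the classical two-fold Oaxaca-Blinder decomposition of a mean gap, which the paper extends to censored duration data, can be sketched as follows; the synthetic data and the choice of group-B coefficients as the reference are illustrative assumptions.

    # Sketch: classical two-fold Oaxaca-Blinder decomposition of a mean gap, with
    # group-B coefficients as the reference.  Synthetic data; the paper's censored
    # extension is not implemented here.
    import numpy as np

    rng = np.random.default_rng(0)
    def simulate(n, beta):
        X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
        y = X @ beta + rng.normal(scale=0.5, size=n)
        return X, y

    X_a, y_a = simulate(500, np.array([1.0, 0.8, 0.3]))
    X_b, y_b = simulate(500, np.array([0.5, 0.8, 0.3]))

    beta_a, *_ = np.linalg.lstsq(X_a, y_a, rcond=None)
    beta_b, *_ = np.linalg.lstsq(X_b, y_b, rcond=None)
    xbar_a, xbar_b = X_a.mean(axis=0), X_b.mean(axis=0)

    explained   = (xbar_a - xbar_b) @ beta_b        # differences in characteristics
    unexplained = xbar_a @ (beta_a - beta_b)        # differences in coefficients
    print(y_a.mean() - y_b.mean(), explained + unexplained)   # the two sides agree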