941 results for TTT and CCT diagrams


Relevance:

100.00%

Publisher:

Abstract:

The effect that breed standards and selective breeding practices have on the welfare of pedigree dogs has recently come under scrutiny from both the general public and scientific community. Recent research has suggested that breeding for particular aesthetic traits, such as tightly curled tails, highly domed skulls and short muzzles predisposes dogs with these traits to certain inherited defects, such as spina bifida, syringomyelia and brachycephalic airway obstruction syndrome, respectively. Further to this, there is a very large number of inherited diseases that are not related to breed standards, which are thought to be prevalent, partly as a consequence of inbreeding and restricted breeding pools. Inherited diseases, whether linked to conformation or not, have varying impact on the individuals affected by them, and affect varying proportions of the pedigree dog population. Some diseases affect few breeds but are highly prevalent in predisposed breeds. Other diseases affect many breeds, but have low prevalence within each breed. In this paper, we discuss the use of risk analysis and severity diagrams as means of mapping the overall problem of inherited disorders in pedigree dogs and, more specifically, the welfare impact of specific diseases in particular breeds.

Relevance:

100.00%

Publisher:

Abstract:

In proposing theories of how we should design and specify networks of processes, it is necessary to show that the semantics of any language we use to write down the intended behaviours of a system has several qualities: first, that the meaning of what is written on the page reflects the intention of the designer; second, that there are no unexpected behaviours, hidden from the unsuspecting specifier, that might arise in a specified system; and third, that the intention for the design of the behaviour of a network of processes can be communicated clearly and intuitively to others. In order to achieve this we have developed a variant of CSP, called CSPt, designed to solve the problems of termination of parallel processes present in the original formulation of CSP. In CSPt we introduced three parallel operators, each with a different kind of termination semantics, which we call synchronous, asynchronous and race. These operators provide specifiers with an expressive and flexible toolkit to define the intended behaviour of a system in such a way that unexpected or unwanted behaviours are guaranteed not to take place. In this paper we extend our analysis of CSPt and introduce the notion of an alphabet diagram, which illustrates the different categories of events that can arise in the parallel composition of processes. These alphabet diagrams are then used to analyse networks of three parallel processes with the aim of identifying sufficient constraints to ensure associativity of their parallel composition. Having achieved this, we prove associativity laws for the three parallel operators of CSPt. Next, we illustrate how to design and construct a network of three processes that satisfies the associativity law, using the associativity theorem and alphabet diagrams. Finally, we outline how this could be achieved for more general networks of processes.
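
The distinction between the three termination semantics can be illustrated operationally. The sketch below is a rough analogy in Python using threads, not CSPt itself; the worker names and delays are invented. It shows three ways a two-process composition can decide it has terminated: both processes synchronising on a common termination point, each terminating independently with the composition ending once both have, and the composition ending as soon as the first process terminates.

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED, ALL_COMPLETED

def worker(name, duration, barrier=None):
    time.sleep(duration)              # the process's own (hypothetical) behaviour
    if barrier is not None:
        barrier.wait()                # synchronous: both sides must be ready to terminate together
    return f"{name} terminated"

def run(mode):
    with ThreadPoolExecutor(max_workers=2) as pool:
        barrier = threading.Barrier(2) if mode == "synchronous" else None
        futures = [pool.submit(worker, "P", 0.1, barrier),
                   pool.submit(worker, "Q", 0.3, barrier)]
        if mode == "race":
            done, not_done = wait(futures, return_when=FIRST_COMPLETED)
            for f in not_done:        # race: the first terminator decides; the rest are abandoned
                f.cancel()
        else:                         # synchronous and asynchronous: composition ends once both have
            done, _ = wait(futures, return_when=ALL_COMPLETED)
        return sorted(f.result() for f in done)

for mode in ("synchronous", "asynchronous", "race"):
    print(mode, run(mode))
```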

Relevance:

100.00%

Publisher:

Abstract:

A new method, based on linear correlation and phase diagrams, was successfully developed for processes such as sedimentation, where the deposition phase can have different durations - represented by repeated values in a series - and where erosion can play an important role by deleting values from the series. The sampling process itself can produce repeated values - a large stratum sampled twice - or deleted values - a thin stratum falling between two consecutive samples. We developed a mathematical procedure which, based on the evolution of chemical composition with depth, allows the establishment of boundaries between, and the periodicity of, different sedimentary environments. The basic tool is no more than a linear correlation analysis, which allows us to detect possible evolution rules connected with cyclical phenomena within the time series (with depth treated as time), with the final objective of prediction. A very interesting discovery was the phenomenon of repeated sliding windows, which represent quasi-cycles of the series with quasi-periods. An accurate forecast can be obtained if we are inside a quasi-cycle (it is possible to predict the remaining elements of the cycle with a probability related to the number of repeated and deleted points). This is an innovative methodology, and its efficiency is therefore being tested in several case studies, with remarkable results that show its efficacy. Keywords: sedimentary environments, sequence stratigraphy, data analysis, time series, conditional probability.
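
As a rough illustration of the kind of correlation analysis described above (this is not the authors' code; the series, window length and threshold are invented), the sketch below scans a depth-ordered series for pairs of sliding windows that are almost perfectly linearly correlated, which is one way candidate quasi-cycles and their quasi-periods could be flagged.

```python
import numpy as np

def repeated_windows(series, window, min_r=0.95):
    """Return (i, j, r) for window pairs whose Pearson correlation exceeds min_r."""
    x = np.asarray(series, dtype=float)
    hits = []
    for i in range(len(x) - window):
        for j in range(i + window, len(x) - window + 1):
            a, b = x[i:i + window], x[j:j + window]
            if a.std() == 0 or b.std() == 0:
                continue                      # skip constant windows (correlation undefined)
            r = np.corrcoef(a, b)[0, 1]
            if r >= min_r:
                hits.append((i, j, r))        # j - i approximates a quasi-period
    return hits

# Toy depth series with a repeated (quasi-cyclic) segment and one duplicated value.
depth_series = [1.0, 2.0, 3.5, 3.5, 2.0, 1.0, 1.1, 2.1, 3.4, 3.6, 2.0, 0.9]
print(repeated_windows(depth_series, window=5, min_r=0.9))
```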

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents the application of the diagrammatic decomposition method to decays of B mesons into three charmless pseudoscalar mesons. The diagrammatic decomposition of decays of the types B → Kππ, B → KKK̄, B → KK̄π and B → πππ is carried out systematically. It is shown that when exchange and annihilation diagrams, whose contributions are estimated to be small, are neglected, new relations appear among the amplitudes. These relations are new tests of the Standard Model that can only be obtained with the diagrammatic method. Where the necessary data are available, we check these relations and find good agreement with experiment. We also show that the B → Kππ sector can be used to measure the weak phase γ with a theoretical uncertainty that we estimate to be of order 5%. The other decay sectors allow weak phases to be extracted only by invoking approximations of unknown precision.

Relevance:

100.00%

Publisher:

Abstract:

Rationale: Glaucoma causes progressive vision loss due to deterioration of the optic nerve. It is widespread worldwide and is responsible for blindness in about seven million people. Glaucoma affects more than 400,000 Canadians, and its prevalence is increasing as the population ages.1,2 It is an insidious chronic disease whose symptoms appear only at advanced stages and can lead to blindness. At present, the only possible way to halt the progression of glaucoma at the initial stage is to lower intraocular pressure (IOP). Topical prostaglandin analogues (PGAs) are frequently used as first-line treatment. However, research shows that this class of drugs can alter certain properties of the cornea and possibly influence the measurement of IOP.3

Objective: To determine whether the use of PGAs affects the biomechanical properties of the cornea. The conclusion is based on an integrated analysis of results obtained with the Reichert Ocular Response Analyzer (ORA), Goldmann applanation tonometry (GAT) and ultrasound pachymetry. A secondary objective of this study is to determine the correlation, if any, between corneal biomechanical properties, central corneal thickness (CCT) and IOP in patients under topical PGA treatment. The main hypothesis of this study is that PGAs influence corneal properties such as central thickness, elasticity and resistance.

Patients and methods: Seventy eyes of 35 patients aged 50-85 years with open-angle glaucoma (OAG) treated with a topical PGA were examined. Only subjects with a manifest refraction between -6.00 D and +4.25 D were included. Exclusion criteria were any other corneal disease, such as Fuchs' endothelial dystrophy or keratoconus, any history of corneal trauma or surgery, and contact lens wear. Glaucoma patients with stable parameters who were using a PGA in both eyes were asked to stop the PGA in the eye less affected in terms of IOP and to continue the PGA in the contralateral eye. The better eye was defined as the one with less visual field (VF) damage (less negative mean deviation, MD), with a lower historical maximum IOP if the MD was equal, or else the one with more damage on optical coherence tomography (OCT, Cirrus, CA) or confocal scanning laser tomography (HRT, Heidelberg, Germany). All measurements were taken before PGA cessation and repeated 6 weeks after stopping. Patients then resumed the PGA and all measurements were repeated once more after a further 6-week period. After starting or changing glaucoma treatment, a patient is typically seen about 4-6 weeks later to assess the efficacy of the drop,4 which is why a 6-week interval was chosen. All measurements were performed at the Montreal Glaucoma Institute by the same technician, with the same equipment and at the same time of day. The contralateral eye served as the control eye for the statistical analyses.
Goldmann applanation tonometry was used to measure IOP, ultrasound pachymetry to measure CCT, and the ORA to measure the biomechanical properties of the cornea, including corneal hysteresis (CH). The hypothesis of no effect of PGA cessation on the biomechanical properties was tested with a linear mixed-effects model using the statistical software R. Random effects were defined at two levels: the patient (level 1) and each patient's eye (level 2), and were added to the model to account for intra-individual variance. Age was also included in the model as a covariate. Contrasts between eyes and time points were estimated using p-values adjusted to control family-wise error rates with the multcomp package in R.

Results: A statistically significant increase in CH was found between visits 1 (on PGA) and 2 (off PGA) in the study eyes, with mean (± standard error) values of 8.98 ± 0.29 mmHg and 10.35 ± 0.29 mmHg respectively, corresponding to a mean increase of 1.37 ± 0.18 mmHg (p < 0.001). A significant reduction of 1.25 ± 0.18 mmHg (p < 0.001) was observed between visits 2 and 3, with a final mean CH of 9.09 ± 0.29 mmHg. In addition, a statistically significant difference between the study and control eyes was observed only at visit 2 (1.01 ± 0.23 mmHg, p < 0.001) and not at visits 1 and 3. A statistically significant increase in the corneal resistance factor (CRF) was found between visits 1 and 2 in the study eyes, with mean values of 10.23 ± 0.34 mmHg and 11.71 ± 0.34 mmHg respectively. The CRF then decreased by 1.90 ± 0.21 mmHg (p < 0.001) between visits 2 and 3, to a final mean of 9.81 ± 0.34 mmHg. A statistically significant difference between the study and control eyes was observed only at visit 2 (1.46 ± 0.23 mmHg, p < 0.001). A statistically significant increase in CCT was found between visits 1 and 2 in the study eyes, with mean values of 541.83 ± 7.27 µm and 551.91 ± 7.27 µm respectively, corresponding to a mean increase of 10.09 ± 0.94 µm (p < 0.001). CCT then decreased by 9.40 ± 0.94 µm (p < 0.001) between visits 2 and 3, to a final mean of 542.51 ± 7.27 µm. A difference between the study and control eyes was recorded only at visit 2 (11.26 ± 1.79 µm, p < 0.001). Similarly, a significant increase in IOP was observed between visits 1 and 2 in the study eyes, with mean values of 15.37 ± 0.54 mmHg and 18.37 ± 0.54 mmHg respectively, corresponding to a mean increase of 3.0 ± 0.49 mmHg (p < 0.001). A significant reduction of 2.83 ± 0.49 mmHg (p < 0.001) was observed between visits 2 and 3, with a final mean IOP of 15.54 ± 0.54 mmHg. The control and study eyes differed only at visit 2 (1.91 ± 0.49 mmHg, p < 0.001), confirming the efficacy of the PGA treatment. At visit 1, the IOP bias (IOPcc - Goldmann IOP) was similar in the two groups, with mean values of 4.1 ± 0.54 mmHg in the control eyes and 4.8 ± 0.54 mmHg in the study eyes.
At visit 2, after the 6-week PGA washout, the IOP bias in the tested eye was reduced to 1.6 ± 0.54 mmHg (p < 0.001), meaning that the underestimation of IOP by GAT was significantly smaller at visit 2 than at visit 1. The difference in mean IOP bias between the study and control eyes at visit 2, however, did not reach statistical significance (p = 0.124). A marginally significant increase of 1.53 ± 0.60 mmHg (p = 0.055) was observed between visits 2 and 3 in the study eyes, with final mean IOP bias values of 3.10 ± 0.54 mmHg in the study eyes and 2.8 ± 0.54 mmHg in the control eyes. We then examined whether low CH was associated with a more advanced stage of glaucoma in our open-angle glaucoma patients treated with a PGA. Considering all eyes on PGA at the first visit, no association was found between VF damage and CH. However, considering only the eyes with more advanced glaucoma, a significant positive correlation was observed between MD and CH (B = 0.65, p = 0.003): lower CH was associated with a more negative VF mean deviation and thus with more glaucoma-related damage.

Conclusions: Topical prostaglandin analogues affect the biomechanical properties of the cornea. They reduce corneal hysteresis, the corneal resistance factor and central corneal thickness. These changes must be taken into account when assessing the effect of PGAs on IOP. Further research should be conducted to confirm our results. In addition, future studies could use IOP-lowering drugs that do not influence corneal biomechanical properties, or a tonometer such as the Pascal dynamic contour tonometer, or similar, that does not depend on the biomechanical properties of the cornea. Regarding the interaction between glaucoma damage and corneal hysteresis, we conclude that lower CH was associated with a more negative VF mean deviation.

Keywords: glaucoma - prostaglandin analogues - corneal hysteresis - central corneal thickness - intraocular pressure - corneal biomechanical properties.
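
For readers who want to see the shape of such an analysis, the sketch below fits an analogous two-level mixed-effects model in Python with statsmodels rather than in R with multcomp as in the study; the data frame, column names and effect sizes are synthetic stand-ins, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: 35 patients, 2 eyes, 3 visits (all values invented).
rng = np.random.default_rng(0)
rows = []
for patient in range(35):
    patient_effect = rng.normal(0, 0.5)
    age = int(rng.integers(50, 86))
    for eye in ("study", "control"):
        eye_effect = rng.normal(0, 0.3)
        for visit in (1, 2, 3):
            ch = 9.0 + (1.4 if (eye == "study" and visit == 2) else 0.0)
            rows.append(dict(patient=patient, eye=eye, visit=visit, age=age,
                             CH=ch + patient_effect + eye_effect + rng.normal(0, 0.4)))
df = pd.DataFrame(rows)

# Random intercept per patient plus a variance component for eye within patient,
# mirroring the two-level random-effects structure described in the abstract.
model = smf.mixedlm("CH ~ C(visit) * C(eye) + age", data=df,
                    groups="patient", vc_formula={"eye": "0 + C(eye)"})
print(model.fit().summary())
```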

Relevance:

100.00%

Publisher:

Abstract:

The effect of coupling two chaotic Nd:YAG lasers with intracavity KTP crystal for frequency doubling is numerically studied for the case of the laser operating in three longitudinal modes. It is seen that the system goes from chaotic to periodic and then to steady state as the coupling constant is increased. The intensity time series and phase diagrams are drawn and the Lyapunov characteristic exponent is calculated to characterize the chaotic and periodic regions.
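
The Lyapunov-exponent calculation used to separate the chaotic and periodic regions can be illustrated on a much simpler system. The sketch below uses the logistic map, not the three-mode laser rate equations of the paper, and estimates the largest Lyapunov exponent as the orbit average of log|f'(x)|, which is negative in a periodic window and positive in the chaotic regime.

```python
import math

def lyapunov_logistic(r, n_iter=100_000, x0=0.4):
    """Average of log|f'(x)| along an orbit of the logistic map f(x) = r*x*(1 - x)."""
    x, total = x0, 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1.0 - 2.0 * x)))   # log|f'(x_n)|
        x = r * x * (1.0 - x)
    return total / n_iter

print(lyapunov_logistic(3.5))   # periodic window: negative exponent
print(lyapunov_logistic(4.0))   # chaotic regime: positive exponent, close to ln 2
```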

Relevance:

100.00%

Publisher:

Abstract:

Almost everyone sketches. People use sketches day in and day out in many different and heterogeneous fields, for example to share their thoughts and clarify ambiguous interpretations. The media used to sketch vary from analog tools like flipcharts to digital tools like smartboards. Whereas analog tools usually suffer from insufficient editing capabilities like cut/copy/paste, digital tools support these scenarios well. Digital tools can be grouped into informal and formal tools. Informal tools can be understood as simple drawing environments, whereas formal tools offer sophisticated support to create, optimize and validate diagrams of a certain application domain. Most digital formal tools force users to stick to a concrete syntax and editing workflow, limiting the user's creativity. For that reason, many people first sketch their ideas using the flexibility of analog or digital informal tools, and the sketch is subsequently "portrayed" in an appropriate digital formal tool. This work presents Scribble, a highly configurable and extensible sketching framework which allows sketching features to be dynamically injected into existing graphical diagram editors based on Eclipse GEF. This combines the flexibility of informal tools with the power of formal tools without any effort: no additional code is required to augment a GEF editor with sophisticated sketching features. Scribble recognizes drawn elements as well as handwritten text and automatically generates the corresponding domain elements. A local training-data library is created dynamically by incrementally learning shapes drawn by the user. Training data can be shared with others using the WebScribble web application, which was created as part of this work.
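
As an illustration of recognizing strokes against an incrementally grown template library (this is a generic template-matching sketch, not Scribble's recognizer; the shapes and point lists are invented), a stroke can be resampled by arc length, normalized, and compared with each stored template by mean point-to-point distance.

```python
import numpy as np

def normalize(stroke, n=32):
    """Resample a stroke to n points by arc length, then center and scale it to a unit box."""
    pts = np.asarray(stroke, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    d = np.concatenate([[0.0], np.cumsum(seg)])          # cumulative arc length
    t = np.linspace(0.0, d[-1], n)
    res = np.column_stack([np.interp(t, d, pts[:, 0]), np.interp(t, d, pts[:, 1])])
    res -= res.mean(axis=0)
    span = np.ptp(res, axis=0).max()
    return res / span if span > 0 else res

def recognize(stroke, templates):
    """Return the template label with the smallest mean point-to-point distance."""
    s = normalize(stroke)
    scores = {label: np.linalg.norm(normalize(t) - s, axis=1).mean()
              for label, t in templates.items()}
    return min(scores, key=scores.get)

# Tiny, incrementally extensible training library (shapes are hypothetical).
templates = {
    "line":   [(0, 0), (10, 10)],
    "corner": [(0, 0), (10, 0), (10, 10)],
}
print(recognize([(1, 1), (9, 2), (11, 11)], templates))   # expected: "corner"
```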

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes a methodology, a representation, and an implemented program for troubleshooting digital circuit boards at roughly the level of expertise one might expect in a human novice. Existing methods for model-based troubleshooting have not scaled up to deal with complex circuits, in part because traditional circuit models do not explicitly represent aspects of the device that troubleshooters would consider important. For complex devices the model of the target device should be constructed with the goal of troubleshooting explicitly in mind. Given that methodology, the principal contributions of the thesis are ways of representing complex circuits to help make troubleshooting feasible. Temporally coarse behavior descriptions are a particularly powerful simplification. Instantiating this idea for the circuit domain produces a vocabulary for describing digital signals. The vocabulary has a level of temporal detail sufficient to make useful predictions about the response of the circuit while remaining coarse enough to make those predictions computationally tractable. Other contributions are principles for using these representations. Although not embodied in a program, these principles are sufficiently concrete that models can be constructed manually from existing circuit descriptions such as schematics, part specifications, and state diagrams. One such principle is that if there are components with particularly likely failure modes, or failure modes in which their behavior is drastically simplified, this knowledge should be incorporated into the model. Further contributions include the solution of technical problems resulting from the use of explicit temporal representations and design descriptions with tangled hierarchies.
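
The idea of a temporally coarse signal vocabulary can be sketched very simply. The categories below are invented for illustration, not the thesis's actual vocabulary; the point is that a sampled waveform is reduced to a handful of qualitative descriptions that are cheap to reason about when predicting circuit behaviour.

```python
# Sketch of a temporally coarse description of a digital signal (illustrative labels only).
def coarse_description(samples):
    """Map a list of 0/1 samples to a coarse qualitative label."""
    transitions = sum(a != b for a, b in zip(samples, samples[1:]))
    if transitions == 0:
        return "constant-low" if samples[0] == 0 else "constant-high"
    if transitions <= 2:
        return "changed-once-or-twice"    # e.g. a reset pulse or a single level change
    return "cycling"                      # e.g. a clock or a data stream

print(coarse_description([0] * 16))                 # constant-low
print(coarse_description([0, 0, 1, 1, 1, 1, 1]))    # changed-once-or-twice
print(coarse_description([0, 1] * 8))               # cycling
```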

Relevance:

100.00%

Publisher:

Abstract:

These slides were developed as part of our work with the JISC Community Engagement Team and CETIS to introduce people to different forms of system modelling, including scenarios and personas, soft systems methods, UML (use cases, activity diagrams and sequence diagrams), BPMN, and EA (enterprise architecture) modelling with ArchiMate.

Relevance:

100.00%

Publisher:

Abstract:

Forecasting atmospheric blocking is one of the main problems facing medium-range weather forecasters in the extratropics. The European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) provides an excellent basis for medium-range forecasting as it provides a number of different possible realizations of the meteorological future. This ensemble of forecasts attempts to account for uncertainties in both the initial conditions and the model formulation. Since 18 July 2000, routine output from the EPS has included the field of potential temperature on the surface where potential vorticity (PV) equals 2 PV units (PVU), the dynamical tropopause. This has enabled the objective identification of blocking using an index based on the reversal of the meridional potential-temperature gradient. A year of EPS probability forecasts of Euro-Atlantic and Pacific blocking has been produced and is assessed in this paper, concentrating on the Euro-Atlantic sector. Standard verification techniques such as Brier scores, Relative Operating Characteristic (ROC) curves and reliability diagrams are used. It is shown that Euro-Atlantic sector-blocking forecasts are skilful relative to climatology out to 10 days, and are more skilful than the deterministic control forecast at all lead times. The EPS is also more skilful than a probabilistic version of this deterministic forecast, though the difference is smaller. In addition, it is shown that the onset of a sector-blocking episode is less well predicted than its decay. As the lead time increases, the probability forecasts tend towards a model climatology with slightly less blocking than is seen in the real atmosphere. This small under-forecasting bias in the blocking forecasts is possibly related to a westerly bias in the ECMWF model. Copyright © 2003 Royal Meteorological Society
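
As a reminder of how one of the verification measures works, the sketch below computes the Brier score for a toy set of blocking probability forecasts; the numbers are invented, not taken from the EPS.

```python
import numpy as np

def brier_score(forecast_probs, observed):
    """Mean squared difference between the forecast probability and the 0/1 outcome."""
    p = np.asarray(forecast_probs, dtype=float)
    o = np.asarray(observed, dtype=float)
    return np.mean((p - o) ** 2)

probs = [0.8, 0.1, 0.6, 0.3, 0.9]        # P(blocking) derived from the ensemble (invented)
obs   = [1,   0,   1,   0,   1]          # blocking observed (1) or not (0)
print(brier_score(probs, obs))           # 0 is perfect; a constant 0.5 forecast scores 0.25
```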

Relevance:

100.00%

Publisher:

Abstract:

Self-consistent field theory (SCFT) is used to study the step edges that occur in thin films of lamellar-forming diblock copolymer, when the surfaces each have an affinity for one of the polymer components. We examine film morphologies consisting of a stack of ν continuous monolayers and one semi-infinite bilayer, the edge of which creates the step. The line tension of each step morphology is evaluated and phase diagrams are constructed showing the conditions under which the various morphologies are stable. The predicted behavior is then compared to experiment. Interestingly, our atomic force microscopy (AFM) images of terraced films reveal a distinct change in the character of the steps with increasing ν, which is qualitatively consistent with our SCFT phase diagrams. Direct quantitative comparisons are not possible because the SCFT is not yet able to probe the large polymer/air surface tensions characteristic of experiment.

Relevance:

100.00%

Publisher:

Abstract:

This paper pursues the study carried out in [10], focusing on the codimension-one Hopf bifurcations in the hexagonal Watt governor system. Here, Hopf bifurcations of codimensions two, three and four are studied, together with the pertinent Lyapunov stability coefficients and bifurcation diagrams. This makes it possible to determine the number, types and positions of bifurcating small-amplitude periodic orbits. As a consequence, an open region is found in the parameter space where two attracting periodic orbits coexist with an attracting equilibrium point.
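
The notion of a bifurcation diagram for a Hopf bifurcation can be illustrated on the supercritical normal form rather than on the Watt governor equations themselves; the sketch below is such an illustration, with invented parameter values. The asymptotic amplitude is zero before the bifurcation and grows roughly like the square root of the parameter afterwards.

```python
import numpy as np

def asymptotic_amplitude(mu, r0=0.5, dt=0.01, steps=20_000):
    """Euler-integrate the radial normal form r' = mu*r - r**3 and return the final radius."""
    r = r0
    for _ in range(steps):
        r += dt * (mu * r - r ** 3)
    return r

for mu in np.linspace(-0.2, 0.4, 7):
    print(f"mu = {mu:+.2f}  amplitude ~ {asymptotic_amplitude(mu):.3f}")
# amplitude decays toward 0 for mu <= 0 and grows roughly like sqrt(mu) beyond the Hopf point
```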

Relevance:

100.00%

Publisher:

Abstract:

Semantic Analysis is a business analysis method designed to capture system requirements. While these requirements may be represented as text, the method also advocates the use of Ontology Charts to formally denote the system's required roles, relationships and forms of communication. Following model-driven engineering techniques, Ontology Charts can be transformed into temporal database schemas, class diagrams and component diagrams, which can then be used to produce software systems. A nice property of these transformations is that the resulting system design models lend themselves to complicated extensions that do not require changes to the design models. For example, the resulting databases can be extended with new types of data without the need to modify the database schema of the legacy system. Semantic Analysis is not widely used in software engineering, so there is a lack of experts in the field and no design patterns are available. This makes it difficult for analysts to pass organizational knowledge to the engineers. This study describes an implementation that is readily usable by engineers, which includes an automated technique that can produce a prototype from an Ontology Chart. The use of such tools should enable developers to make use of Semantic Analysis with minimal expertise in ontologies and MDA.
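
To give a flavour of the model-driven transformation step, the sketch below turns a hand-written, ontology-chart-like structure into SQL table definitions with temporal start/finish columns. The affordances, dependency lists and mapping rules here are all hypothetical illustrations, not the ones used in the study.

```python
# Hypothetical ontology-chart-like input: each affordance lists its ontological dependencies.
affordances = {
    "person":  {"depends_on": []},
    "society": {"depends_on": []},
    "employs": {"depends_on": ["society", "person"]},
}

def to_sql(name, spec):
    """Emit a CREATE TABLE statement with temporal columns and foreign keys for dependencies."""
    cols = ["id INTEGER PRIMARY KEY",
            "start_time TIMESTAMP",      # when the affordance starts to exist
            "finish_time TIMESTAMP"]     # when it ceases to exist (temporal database style)
    cols += [f"{dep}_id INTEGER REFERENCES {dep}(id)" for dep in spec["depends_on"]]
    return f"CREATE TABLE {name} (\n  " + ",\n  ".join(cols) + "\n);"

for name, spec in affordances.items():
    print(to_sql(name, spec))
```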

Relevance:

100.00%

Publisher:

Abstract:

Futures education (FE) in a rapidly changing world is critical if young people are to be empowered to be proactive rather than reactive about the future. Research into young people's images and ideas of the future leads to the disturbing conclusion that, for many, the future is a depressing and fearful place where they feel hopeless and disempowered. On the other hand, as Richard Slaughter writes, 'young people are passionately interested in their own futures, and that of the society in which they live'. They universally 'jump at the chance to study something with such intrinsic interest that also intersects with their own life interests in so many ways'. FE explicitly attempts to build on this interest and counter these fears by offering a profound and empowering set of learning strategies and ideas that can help people think and act critically and creatively about the future, without necessarily trying to predict it. Futures educators have, over the past decades, developed useful tools, ideas and a language for use with students of all ages to enable them to develop foresight literacy. Most of us tend to view the future as somehow beyond the present and rarely consider how decisions and choices made today profoundly affect not just one fixed future but any number of futures. The underlying goal of FE is to move from the idea of a single, pre-determined future to that of many possible futures, so that students begin to see that they can determine the future, that they need not be reactive and that they are not powerless. How does one do that? Ideas include, but are not limited to: timelines and Y-diagrams, futures wheels and mind maps, and 'Preferable, possible and probable' futures - a.k.a. the 3Ps. Current Australian curricula present education about the future in various implicit or explicit guises. A plethora of statements and curriculum outcomes mention the future, but essentially take 'it' for granted, and are uninformed by FE literature, language, ideas or tools. Science, the humanities and technology tend to be the main areas where such an implicit futures focus can be found. It also appears in documents about vocational education, civics and lifelong learning. Explicit FE is, as Beare and Slaughter put it, still the missing dimension in education. Explicit FE attempts to develop futures literacy, and draws widely upon futures studies literature for processes and content. FE provides such a wide range of ideas and tools that it can be incorporated into education in any number of ways. Programs in two very different schools, one primary and one secondary, are described in this article to provide examples of some of these ways. The first school, Kimberley Park State Primary School in Brisbane, operates with multi-age classrooms based on a 'thinking curriculum' developed around four organisers: change, perspectives, interconnectedness and sustainability. The second school, St John's Grammar School in Adelaide, is an independent school where FE operates as an integrated approach in Year Seven, as a separate one-semester subject in Year Nine and in separate subjects at other levels. Teachers at both Kimberley Park and St John's are very positive about FE. They say it promotes valuable and authentic learning, assists students to realise they have choices that matter and helps them see that the future need not be all doom and gloom.
Because students are interested in the Big Questions, as one teacher put it, FE provides a perfect opportunity to address them, and to consider values that are fundamental for them and the future of the planet. Like any innovation, the long-term success of FE in schools depends on an embedding process so that the innovation does not depend on the enthusiasm and energy of a few individuals, only to disappear when they move on. It requires strong leadership, teacher knowledge, support and enthusiasm, and the support and understanding of the wider school community.