918 results for Sequence Diagram


Relevance: 20.00%

Abstract:

Variation in hiring procedures occurs within fire service human resource departments. In this study, City 1 and City 2 applicants were required to pass a biophysical assessment prior to being hired as firefighters, administered at the beginning and at the end of the screening process, respectively. City 1 applicants demonstrated significantly lower resting heart rate (RHR), resting diastolic blood pressure (RDBP), and body fat percentage (BF%), and higher z-scores for BF, trunk flexibility (TF), and overall clinical assessment (p<0.05). Regression analysis found that age and conducting the biophysical assessment at the end of the screening process explained poorer biophysical assessment results in BF% (R2=21%), BF z-score (R2=22%), TF z-score (R2=10%), and overall clinical assessment z-score (R2=7%). Each of RHR (OR=1.06, CI=1.01-1.10), RDBP (OR=1.05, CI=1.00-1.11), and BF% (OR=1.20, CI=1.07-1.37) increased the odds of being a City 2 firefighter (p<0.05). Biophysical screening at the end of the hiring process may result in the hiring of less healthy firefighters.

Relevance: 20.00%

Abstract:

The complete genome of an Erwinia amylovora bacteriophage, vB_EamM_Ea35-70 (Ea35-70), is 271,084 bp, encodes 318 putative proteins, and contains one tRNA. Comparative analysis with other Myoviridae genomes suggests that Ea35-70 is related to the Phikzlikevirus genus within the family Myoviridae, since 26% of Ea35-70 proteins share homology to proteins in Pseudomonas phage φKZ.

Relevance: 20.00%

Abstract:

A diagram of a horse, labelled and illustrated by Dorothy Rungeling.

Relevance: 20.00%

Abstract:

Diagram of the north crib float. This document is stained, n.d.

Relevance: 20.00%

Abstract:

Diagram of lot no. 10 in Willoughby. The names on the outside of this document include: Matthew Singh, provincial land surveyor, Toronto; George S. Field, contractor, Niagara Falls; E. T. Phelps and H. Lyman, lawyer, Niagara Falls. The document is quite worn and fragile. This does not affect the text, n.d.

Relevance: 20.00%

Abstract:

Diagram of Lot 142 and Lot 186 showing the line of the road in red, Dec.10, 1856.

Relevance: 20.00%

Abstract:

Diagram of the waste weir, n.d.

Relevance: 20.00%

Abstract:

Affiliation: Département de biochimie, Faculté de médecine, Université de Montréal

Relevance: 20.00%

Abstract:

This master's thesis focuses on the massive binary system CV Serpentis, composed of a carbon-rich Wolf-Rayet star and a main-sequence star of spectral type O (WC8d + O8-9IV). First, certain phenomena affecting massive stars are discussed, from their arrival on the main sequence to their death (supernova). The first chapter reviews some foundations of observational stellar astrophysics (the Hertzsprung-Russell diagram, evolutionary phases, etc.). The next chapter addresses one of the most important aspects of the life of massive stars: mass loss in the form of stellar winds. A history of the discovery of these winds opens the chapter, followed by the theoretical foundations explaining this phenomenon. Various aspects specific to stellar winds are then presented. The third chapter gives a detailed history of CV Ser as an introduction to this singular object, mentioning its main known characteristics. Finally, the core of this thesis is found in Chapter 4. Ultra-precise light curves from the MOST satellite (2009 and 2010) show an apparent variation in the mass-loss rate of the WR star of about 62% over an orbital period of 29.701 days. Analysis of the residuals reveals a signature suggesting the presence of corotating interaction regions (CIRs) in the WR wind. A new orbital solution is presented, along with the parameters of the wind-collision region, and the spectral types are confirmed.

Relevance: 20.00%

Abstract:

In the present investigation, an attempt is made to study the late Quaternary foraminiferal and pteropod records of the shelf of northern Kerala and to evaluate their potential for paleoceanographic and paleoclimatic reconstruction. The study gives details of the sediment cores; the general characteristics of the foraminifera and pteropod species recorded from the examined samples and their systematic classification; and the spatial distribution of Recent foraminifera and pteropods and their response to varying bathymetry, nature of substrate, organic matter content in sediment, and hydrography across the shelf. An attempt is also made to establish an integrated chronostratigraphy for the examined core sections, to identify microfaunal criteria useful for biostratigraphic division of shallow marine core sections, and to infer the various factors responsible for changes in the microfaunal assemblage. Reconstruction of sea level changes during the last 36,000 years was attempted based on the pteropod record. The study reveals a bathymetric control on the benthic/planktic foraminiferal (BF/PF) and pteropod/planktic foraminiferal (Pt/PF) abundance ratios. The bathymetric distribution pattern of the BF/PF ratio is opposite to that of the Pt/PF ratio, with the former decreasing from the shore across the shelf. The quantitative benthic foraminiferal record in the surficial sediments reveals a positive correlation between diversity and bathymetry. R-mode cluster analysis performed on 30 significant Recent benthic foraminiferal species identifies three major assemblages.

Relevance: 20.00%

Abstract:

Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems, or anomalies, arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to deal with buffer overflow attacks. For protection, Stack Guards and Heap Guards are also used in a wide variety of forms. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that a Bayesian network over frequency variations responds effectively to induced buffer overflows. It can also help administrators detect deviations in program flow introduced by errors.
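The abstract does not reproduce the sequence-set profiling or the exact Bayesian scoring formula. As a rough illustration of the frequency-profile idea only, the sketch below profiles normal traces as relative call frequencies and scores a new trace by its total deviation; the call names are hypothetical, and the deviation sum is a simplified stand-in for the dissertation's Bayesian model.

```python
from collections import Counter

# Hypothetical system-call traces; real traces would come from an OS trace log.
NORMAL_TRACES = [
    ["open", "read", "read", "write", "close"],
    ["open", "read", "write", "write", "close"],
]

def profile(traces):
    """Average relative frequency of each system call over the normal traces."""
    totals = Counter()
    for t in traces:
        for call, n in Counter(t).items():
            totals[call] += n / len(t)
    return {call: s / len(traces) for call, s in totals.items()}

def anomaly_score(trace, norm):
    """Sum of absolute deviations from the normal frequency profile.
    (A simplified stand-in for the Bayesian model described above.)"""
    freq = Counter(trace)
    calls = set(freq) | set(norm)
    return sum(abs(freq[c] / len(trace) - norm.get(c, 0.0)) for c in calls)

norm = profile(NORMAL_TRACES)
print(anomaly_score(["open", "read", "write", "close"], norm))    # close to normal: low score
print(anomaly_score(["mmap", "mprotect", "exec", "exec"], norm))  # unseen calls: high score
```

A trace made of calls never seen in the normal profile deviates on every term, so attack-like behavior such as an injected exec sequence scores far above normal usage.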

Relevance: 20.00%

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This can significantly improve software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults and optimize the code, thus improving both the debugging process and the quality of the code.

Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code.

An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used to detect redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified.

This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler or assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
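The redundant bank-switching check described above can be sketched as tracking the active-bank state along a code sequence: a bank-select instruction that re-selects the already-active bank causes no state transition and is removable. The instruction tuples below are an invented representation, not PIC16F87X assembly, and the sketch handles straight-line code only (the dissertation's state transition diagram covers all execution paths).

```python
# Each instruction is (mnemonic, operand); "banksel" selects a memory bank.
def redundant_banksels(instructions):
    """Return indices of banksel instructions that re-select the active bank."""
    active = None          # bank state unknown at entry
    redundant = []
    for i, (op, arg) in enumerate(instructions):
        if op == "banksel":
            if arg == active:
                redundant.append(i)   # no state change: candidate for removal
            active = arg
    return redundant

code = [
    ("banksel", 0),
    ("movwf", "PORTA"),
    ("banksel", 0),        # redundant: bank 0 is already active
    ("banksel", 1),
    ("movwf", "TRISA"),
]
print(redundant_banksels(code))  # [2]
```

At a control-flow join the active bank may differ between predecessors, which is why the full technique works on the control flow graph rather than a linear scan.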

Relevance: 20.00%

Abstract:

This paper discusses our research in developing a generalized and systematic method for anomaly detection. The key ideas are to represent normal program behaviour using system call frequencies and to incorporate probabilistic techniques for classification to detect anomalies and intrusions. Using experiments on the sendmail system call data, we demonstrate that concise and accurate classifiers can be constructed to detect anomalies. An overview of the approach that we have implemented is provided.
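The paper's sendmail data and exact classifier are not reproduced here; as an illustration of the probabilistic-classification idea only, the toy sketch below trains a multinomial-style Naive Bayes model over system-call frequencies on made-up traces and labels a new trace as normal or anomalous.

```python
import math
from collections import Counter

def train(traces_by_label):
    """Per-label Laplace-smoothed call probabilities (multinomial Naive Bayes)."""
    vocab = {c for traces in traces_by_label.values() for t in traces for c in t}
    model = {}
    for label, traces in traces_by_label.items():
        counts = Counter(c for t in traces for c in t)
        total = sum(counts.values())
        model[label] = {c: (counts[c] + 1) / (total + len(vocab)) for c in vocab}
    return model

def classify(trace, model):
    """Pick the label maximizing the log-likelihood of the trace."""
    def loglik(label):
        probs = model[label]
        floor = min(probs.values())        # crude handling of unseen calls
        return sum(math.log(probs.get(c, floor)) for c in trace)
    return max(model, key=loglik)

# Made-up training traces standing in for the sendmail system-call data.
model = train({
    "normal":  [["open", "read", "write", "close"], ["open", "read", "close"]],
    "anomaly": [["open", "exec", "exec", "socket"], ["socket", "exec"]],
})
print(classify(["read", "write", "close"], model))   # normal
print(classify(["exec", "socket", "exec"], model))   # anomaly
```

The independence assumption makes the classifier concise, which matches the paper's emphasis on concise yet accurate classifiers.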

Relevance: 20.00%

Abstract:

Code clones are portions of source code that are similar to other parts of the program code. The presence of code clones is considered a bad feature of software, as it makes software maintenance difficult. Methods for code clone detection have gained immense significance in recent years, as they play a significant role in engineering tasks such as program code analysis, program understanding, plagiarism detection, error detection, and code compaction. Despite these drawbacks, several features of code clones, if properly utilized, can make the software development process easier. In this work, we point out one such feature of code clones, which highlights their relevance in test sequence identification. Here, program slicing is used in code clone detection. In addition, a classification of code clones is presented, and the benefit of using program slicing in code clone detection is discussed.
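The abstract does not detail its slicing-based detection algorithm; as a minimal illustration of the core comparison step common to clone detectors, the sketch below normalizes identifiers and literals to placeholders so that fragments differing only in names (Type-2 clones) compare equal. The fragments and keyword set are invented for the example.

```python
import re

def normalize(fragment):
    """Tokenize a code fragment and replace identifiers/literals with
    placeholders, so renamed copies produce identical token streams."""
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", fragment)
    keywords = {"for", "in", "range", "return", "if", "def", "while"}
    out = []
    for t in tokens:
        if t in keywords:
            out.append(t)                 # keep structure-bearing keywords
        elif re.fullmatch(r"[A-Za-z_]\w*", t):
            out.append("ID")              # any identifier
        elif t.isdigit():
            out.append("NUM")             # any numeric literal
        else:
            out.append(t)                 # operators and punctuation
    return out

def is_clone(a, b):
    return normalize(a) == normalize(b)

f1 = "total = total + price * qty"
f2 = "acc = acc + cost * n"       # same structure, different names: a clone
f3 = "acc = acc - cost * n"       # different operator: not a clone
print(is_clone(f1, f2), is_clone(f1, f3))  # True False
```

Slicing-based approaches go further than this token comparison: by comparing slices rather than raw text, they can match clones whose statements are reordered or interleaved with unrelated code.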