427 results for "Checks"
Abstract:
The work was carried out at the company Masterprojetos – soluções integradas, through the preparation and development of the structural design of a five-storey building, one storey of which is below ground. The building comprises a multi-family residential block and a block combining a single-family dwelling with commercial space. Both the building and the company are located in Peso da Régua. The work can be divided into two parts. The first part describes the duties performed and contextualizes the working methodology. It also describes the Cypecad software, which I had no opportunity to learn during my academic studies and therefore learned in a self-taught manner, and reviews some other software packages available on the market for the same field of activity. The second part describes the assumptions adopted in the preliminary design of the building, as well as the elements used to formulate the calculation model. Several analytical checks are also performed to validate the model used for the design, and a comparison is established between the solutions obtained with Cypecad and those obtained by analytical calculation for a solid slab. The conclusions are presented at the end; the most important are the care required when using automatic calculation software, both in the input of data and in interpreting the solutions obtained, and the time savings this type of tool provides. Also worth mentioning are the opportunities the internship provided, such as visits to construction sites, quantity take-off and cost-estimating activities, and the follow-up of designs in other specialties such as building services networks, thermal, acoustics and ITED (Infra-estruturas de Telecomunicações em Edifícios, i.e. telecommunications infrastructure in buildings).
Abstract:
The general objective of this work is the analysis of performance in public administration using financial ratios (the case of the Municipality of Matosinhos). In this study we carry out an economic and financial analysis of the Municipality of Matosinhos, assessing its performance over the period 2011-2014, and we also analyse some of the factors that influence the capital structure and performance of the 12 large municipalities. Regarding the economic and financial analysis of the Municipality of Matosinhos, the results show that in the short term the municipality is in a favourable liquidity position, with a good safety margin; that is, it is able to meet its short-term commitments. Over the four-year period the Municipality of Matosinhos relied progressively less on external capital to finance its assets, a positive trend for the balance of the municipal financial structure. To test for a relationship between capital structure (indebtedness) and performance (return on assets) and the factors that influence them, a non-parametric Spearman correlation analysis was carried out using SPSS version 21. Contrary to the hypothesis formulated and to the conclusions reached in most previous studies, a negative relationship was found, at the 5% significance level, between the level of indebtedness and the size of the municipality. As for the relationship of indebtedness with asset composition and with return on assets, the results are not satisfactory: they show no relationship between indebtedness and these factors. A positive correlation, significant at the 1% level, was found between return on assets and growth; that is, municipalities with higher growth rates show a higher return on assets, confirming our hypothesis 4. However, regarding the expected positive association between a municipality's profitability and its size, the results showed no relationship between the variables.
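The Spearman analysis described above was run in SPSS, but the statistic itself is simple: it is the Pearson correlation of the two rank vectors, with tied values assigned the average of their positions. A minimal sketch (the data in the test are invented, not the municipal figures from the study):

```python
def average_ranks(values):
    """Assign 1-based ranks, averaging ties (as Spearman's rho requires)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # extend j over a run of equal values (a tie group)
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A perfectly monotone relationship gives rho = 1 (or -1 when reversed), regardless of the actual magnitudes, which is what makes the statistic non-parametric.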
Abstract:
This Work Project investigates the determinants of reelection using data on the 278 Portuguese mainland municipalities for the period 1976-2009. We implement a logit fixed-effects model to control for the municipalities’ unobserved characteristics that remain constant over time. Political variables, such as the vote share of the incumbent’s party in the previous election, the number of the mayor’s consecutive mandates and the abstention rate, are found to be relevant in explaining the incumbent’s reelection. Moreover, as to the mayor’s individual characteristics, age and education contribute to explaining reelection prospects. We also provide weak evidence that a higher degree of fiscal autonomy increases political turnover and that good economic prospects of the municipality positively affect reelection. Finally, the residents’ level of education and the size of the municipal population have explanatory power for the mayor’s reelection. We perform several robustness checks to confirm these results.
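The core of such a model is the logistic link between covariates and the reelection probability. As a rough illustration only (a plain logit fitted by gradient ascent, not the fixed-effects logit of the study, and with invented toy data rather than the Portuguese municipal panel):

```python
import math

def fit_logit(X, y, lr=0.5, iters=3000):
    """Fit a logistic regression by batch gradient ascent on the log-likelihood.
    X: rows of features (first entry 1.0 for the intercept); y: 0/1 outcomes."""
    w = [0.0] * len(X[0])
    for _ in range(iters):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
            for j, xj in enumerate(xi):
                grad[j] += (yi - p) * xj  # score contribution of one observation
        w = [wj + lr * g / len(X) for wj, g in zip(w, grad)]
    return w

# Hypothetical data: reelection (1/0) against the incumbent's previous vote share.
vote_share = [0.30, 0.35, 0.40, 0.55, 0.60, 0.65]
reelected = [0, 0, 0, 1, 1, 1]
w = fit_logit([[1.0, v] for v in vote_share], reelected)
```

With data like these, the fitted slope on vote share is positive: a higher previous vote share raises the predicted reelection probability, which is the qualitative pattern the abstract reports.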
Abstract:
The Common Tern (Sterna hirundo) is a ground-nesting colonial seabird. Terns rely primarily on small prey fishes, which they obtain through plunge diving, for their survival as well as the survival of their offspring during the breeding season. The zebra mussel (Dreissena polymorpha) is a small bivalve mollusk that invaded North American waters in the late 1980s. Through its suspension feeding, the zebra mussel has the ability to alter the entire aquatic ecosystem, ultimately leading to a reduction in pelagic organisms including small prey fish. The objective of the study was to determine what (if any) indirect effects the invasion of the zebra mussel has had on fish prey captured by terns. The study took place in two separate two-year periods, 1990-91 and 1995-96, on a concrete breakwall off the north shore of Lake Erie near Port Colborne, Ontario. Daily nest checks revealed clutch initiation dates, egg-laying chronology, hatching success and morphological egg characteristics (length and breadth). Behavioural observations included the time each sex spent in attendance with its brood, the frequency of feeding chicks and the prey species composition and size fed to chicks as well as to females (courtship feeding). Egg sizes did not differ between study periods, nor did feeding rates to chicks, suggesting that food was not a limiting resource. Terns spent less time with their broods (more time foraging) in the 1995-96 period. However, they also had significantly larger broods and fledged more offspring. The time of each individual foraging trip decreased, suggesting that fish were easier to obtain in 1995 and 1996. Lastly, kleptoparasitism rates decreased, suggesting that the costs of foraging (time, energy) actually decreased as fewer birds adopted this strategy to compensate for what I assumed to be a lack of available food (fish). The only significant difference between the periods of 1990, 1991 and 1995, 1996 was a change in diet. 
Terns delivered significantly fewer rainbow smelt and more emerald shiner in 1995 and 1996. However, the average size of fish delivered did not change. Thus, there was little impact on prey captured by Common Terns in Lake Erie since the invasion of the zebra mussel.
Abstract:
This study was initiated in response to the Junior Division Review (1985) published by the Ministry of Education for the Province of Ontario. Curriculum integration is an element used within the educational paradigm designed by the Ontario Ministry of Education. It is a term frequently verbalized by educators in this province, but because of limited resource support regarding this methodology, it was open to broad interpretation, resulting in an extreme variation in its implementation. Indeed, the Ministry intimated that it was not occurring to any significant degree across the province. The objective of this thesis was to define integration in the junior classroom and design a measurement instrument which would in turn highlight indicators of curriculum integration. The study made a preliminary, field-based survey of educational professionals in order to generate a relevant description of integrated curriculum programming as defined in the junior classroom. The description was a compilation of views expressed by a random selection of teachers, consultants, supervisory officers and principals. The survey revealed a much more comprehensive view of the attributes of integrated programming than tradition would dictate and resulted in a functional definition that was broader than past practices. Based on the information generated by this survey, an instrument outlining program criteria of an integrated junior classroom was devised. This measurement instrument, designed for all levels of educators, was named "The Hansson Instrument for the Measurement of Program Integration in the Junior Classroom". It reflected five categories intrinsic to the methodology of integration: Teacher Behaviour, Student Behaviour, Classroom Layout, Classroom Environment and Programming. Each category and the items therein were successfully tested in validity and reliability checks. Interestingly, the individual class was found to be the major variable in the measurement of integrated programming in the junior division. The instrument demonstrated potential not only as an initial measure of the degree of integrated curriculum, but as a guide to strategies to implement such a methodology.
Abstract:
The effects of a complexly worded counterattitudinal appeal on laypeople's attitudes toward a legal issue were examined, using the Elaboration Likelihood Model (ELM) of persuasion as a theoretical framework. This model states that persuasion can result from the elaboration and scrutiny of the message arguments (i.e., central route processing), or can result from less cognitively effortful strategies, such as relying on source characteristics as a cue to message validity (i.e., peripheral route processing). One hundred and sixty-seven undergraduates (85 men and 81 women) listened to either a low status or high status source deliver a counterattitudinal speech on a legal issue. The speech was designed to contain strong or weak arguments. These arguments were worded in a simple and, therefore, easy to comprehend manner, or in a complex and, therefore, difficult to comprehend manner. Thus, there were three experimental manipulations: argument comprehensibility (easy to comprehend vs. difficult to comprehend), argument strength (weak vs. strong), and source status (low vs. high). After listening to the speech, participants completed a measure of their attitude toward the legal issue, a thought listing task, an argument recall task, manipulation checks, measures of motivation to process the message, and measures of mood. As a result of the failure of the argument strength manipulation, only the effects of the comprehensibility and source status manipulations were tested. There was, however, some evidence of more central route processing in the easy comprehension condition than in the difficult comprehension condition, as predicted. Significant correlations were found between attitude and favourable and unfavourable thoughts about the legal issue with easy to comprehend arguments; whereas there was a correlation only between attitude and favourable thoughts toward the issue with difficult to comprehend arguments, suggesting, perhaps, that central route processing, which involves argument scrutiny and elaboration, occurred under conditions of easy comprehension to a greater extent than under conditions of difficult comprehension. The results also revealed, among other findings, several significant effects of gender. Men had more favourable attitudes toward the legal issue than did women, men recalled more arguments from the speech than did women, men were less frustrated while listening to the speech than were women, and men put more effort into thinking about the message arguments than did women. When the arguments were difficult to comprehend, men had more favourable thoughts and fewer unfavourable thoughts about the legal issue than did women. Men and women may have had different affective responses to the issue of plea bargaining (with women responding more negatively than men), especially in light of a local and controversial plea bargain that occurred around the time of this study. Such pre-existing gender differences may have led to the lower frustration, the greater effort, the greater recall, and more positive attitudes for men than for women. Results from this study suggest that current cognitive models of persuasion may not be very applicable to controversial issues which elicit strong emotional responses. Finally, these data indicate that affective responses, the controversial and emotional nature of the issue, gender and other individual differences are important considerations when experts are attempting to persuade laypeople toward a counterattitudinal position.
Abstract:
Eleanore Celeste has been to visit Arthur Schmon's parents. His father has not been feeling well and takes a week vacation per month. His mother worries because the checks Arthur sends do not come when they should. Eleanore Celeste requests that Arthur write to Washington to have it straightened out. She also mentions her family and who they are visiting over the next week. This letter is labelled number 114.
Abstract:
In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions which include normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken’s mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to shed more light on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and multivariate generalizations of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results [over five-year subperiods] show the following: (i) multivariate normality is rejected in most subperiods, (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption, and (iii) mean-variance efficiency of the market portfolio is not rejected as frequently once the possibility of non-normal errors is allowed for.
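The exact likelihood-based tests themselves are beyond a short snippet, but the variance ratio statistic that the diagnostic checks generalize is easy to state: the variance of q-period returns divided by q times the one-period variance, which is near 1 for a random walk. A minimal sketch with invented series (population variances, overlapping sums; not the authors' exact construction):

```python
def variance_ratio(returns, q):
    """Variance-ratio statistic: Var of overlapping q-period sums divided by
    q times the one-period variance. ~1 for a random walk, <1 for mean
    reversion, >1 for positive serial correlation (momentum)."""
    n = len(returns)
    mean = sum(returns) / n
    var1 = sum((r - mean) ** 2 for r in returns) / n
    sums = [sum(returns[i:i + q]) for i in range(n - q + 1)]
    m = len(sums)
    mean_q = sum(sums) / m
    varq = sum((s - mean_q) ** 2 for s in sums) / m
    return varq / (q * var1)
```

A perfectly alternating return series is maximally mean-reverting, so its two-period sums are all zero and VR(2) = 0; a series of positively autocorrelated runs pushes VR(2) above 1.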
Abstract:
Background and objectives. In 1995 the Canadian government enacted Bill C-68, making the registration of all firearms mandatory and strengthening the checks carried out on prospective owners. In the absence of credible scientific evidence, the law's potential to prevent homicides is currently being questioned. While overcoming the potential biases found in earlier evaluations, the objective of this thesis is to assess the effect of Bill C-68 on homicides in Quebec between 1974 and 2006. Methodology. The effect of Bill C-68 is evaluated using an extreme bounds analysis. The immediate and gradual effects of the law are assessed with 372 equations. Briefly, these are interrupted time-series analyses in which all combinations of independent variables are considered in order to avoid the biases associated with an arbitrary model specification. Results. The introduction of Bill C-68 is associated with a gradual decrease in homicides committed with long guns (rifles and shotguns), with no tactical displacement observed. Homicides committed with restricted or prohibited firearms appear to be influenced by different factors. Conclusion. The results suggest that firearm control is an effective measure for preventing homicides. The absence of tactical displacement also suggests that the firearm is an important facilitator and that not all homicides are premeditated. Further studies are nevertheless needed to clearly identify the mechanisms of the law responsible for the decrease in homicides.
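The idea behind extreme bounds analysis is to re-estimate the coefficient of interest under every combination of control variables and report its extreme values. A toy sketch with ordinary least squares and invented data (the thesis itself runs 372 interrupted time-series specifications, not this simplified cross-sectional version):

```python
import itertools

def solve_linear(A, b):
    """Solve A w = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(X, y):
    """Ordinary least squares via the normal equations X'X w = X'y."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve_linear(XtX, Xty)

def extreme_bounds(x, controls, y):
    """Re-estimate the coefficient on the focus variable x under every subset
    of control variables; report its extreme bounds (min, max)."""
    coefs = []
    names = sorted(controls)
    for r in range(len(names) + 1):
        for combo in itertools.combinations(names, r):
            X = [[1.0, x[i]] + [controls[c][i] for c in combo]
                 for i in range(len(x))]
            coefs.append(ols(X, y)[1])  # slot 1 = focus-variable coefficient
    return min(coefs), max(coefs)
```

If the bounds stay on one side of zero across all specifications, the effect is called "robust"; if they straddle zero, it is "fragile". That is the criterion the thesis applies to the post-law homicide trend.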
Abstract:
We propose a new method for quantifying intracardiac vorticity (Doppler vortography), based on conventional Doppler imaging. To characterize the vortices, we use an index called the Blood Vortex Signature (BVS), obtained by applying a covariance-based kernel filter. The BVS index measured by Doppler vortography was validated against Doppler fields from simulations and from in vitro experiments. Preliminary results obtained in healthy subjects and in patients with cardiac complications are also presented in this thesis. Significant correlations were observed between the vorticity estimated by Doppler vortography and the reference method (in silico: r2 = 0.98; in vitro: r2 = 0.86). Our results suggest that Doppler vortography is a promising cardiac ultrasound technique for quantifying intracardiac vortices. This assessment tool could easily be applied in clinical routine to detect ventricular insufficiency and to evaluate diastolic function by Doppler echocardiography.
Abstract:
The treatment of epilepsy in young children is a major issue for their development. For the great majority of children with infantile spasms, and for many with refractory complex partial seizures, vigabatrin (VGB) is an essential treatment. This medication, which has shown a high rate of efficacy in this population, nevertheless appears to lead to an often asymptomatic loss of the peripheral visual field. Clinical assessment of the visual fields with perimetry is, however, very difficult or even impossible in patients below a developmental age of nine years. Classical electrophysiological studies in the paediatric epileptic population suggest damage to structures related to the retinal cones, but standard protocols are not specific to the visual fields, and the deficits they highlight do not match the peripheral impairment observed. This thesis therefore aims to develop a task adapted to the assessment of visual fields in children, using an objective, fast protocol specific to the visual fields and based on visual evoked potentials (VEPs), and to use this method to evaluate the long-term neurotoxic effects of VGB in epileptic children exposed at an early age. The validation of the method is presented in the first article. The stimulus consists of two concentric rings made of phase-reversing checkerboards alternating at different temporal frequencies. Administering the task in adults shows that a single cortical electrode (Oz) is sufficient to record the central and peripheral visual-field responses simultaneously, and that the electrophysiological responses can be collected very quickly thanks to steady-state recording. Comparison of data from normal children and adults shows that the responses recorded in the two visual regions depend neither on age nor on sex. The central responses are also correlated with visual acuity. Moreover, the validity of the method is corroborated in adolescents with a clinical diagnosis of a central or peripheral visual deficit. In short, the validated method makes it possible to assess the central and peripheral cortical visual fields adequately, simultaneously and quickly, in both adults and children. The second article of this thesis concerns the assessment, using the previously validated method, of the visual fields of epileptic children exposed to VGB at a young age, in comparison with epileptic children exposed to other antiepileptic drugs and with neurologically healthy children. The method was enhanced by varying the contrast and by simultaneously recording the retinal response. We find that the central cortical response is reduced at high and medium contrast in children exposed to VGB, and at high contrast in children exposed to other antiepileptic drugs. Contrast gain is altered in both groups of epileptic children. However, the absence of differences between the two neurologically affected groups does not allow the effect of the medication to be distinguished from that of the disease. In addition, the peripheral retinal response is impaired in epileptic children exposed to Sabril® compared with neurologically healthy children, and it seems related to the duration of exposure to the medication. These results corroborate those reported in the literature. In sum, the results of this thesis provide a complementary method, fast, reliable and objective, alongside those already known for assessing visual fields in children. They also shed new light on the possible long-term impacts in children exposed to VGB in early childhood.
Abstract:
While numerous treatises in the legal literature and a large number of published judgments deal with permit law, research on and practical treatment of supervisory law remain limited. This work deepens the understanding of the specific linkage and mutual dependence of initial (licensing) controls and accompanying (ongoing) controls in German occupational-safety, environmental and consumer-protection law. The central point of this work is the development of guiding principles for accompanying supervision in trade-supervision law. To this end it is necessary to consider the various boundary conditions of trade-supervisory action: the expected emergence of a new type of rationality in societal development (Section 2); and the typology of state, trade and economic supervision with their specific patterns of action, the identified deficits of supervisory law and of the supervisory practice of accompanying controls, and the observed actual effects of the supervisory system (Section 3). Further influences on the supervisory model of the future come from the expected and desirable development of permit law (Section 4), from the way accompanying supervision can and should be handled (Section 5), and from the types of privatization and their constitutional limits (Section 6). The work concludes by formulating a future model that rebalances the weight between initial and accompanying control, as well as the relationship between the state, private third parties and companies. Overall, this work argues for a supervisory model in which the state assumes greater responsibility for risk-oriented accompanying supervision. In future, the standards for risk regulation are to be set entirely by the state. State supervision can then confine itself to framework rules and framework audits for the riskiest installations, activities and products, while private third parties handle the detailed investigations and approvals. Private experts can in future take over the main burden of detailed and time-consuming control activities for installations, activities and products of medium risk, and low-risk installations, activities and products can be left to self-monitoring. In return, the state must organize and carry out the supervision of the controllers to a considerably greater extent, on the basis of uniform standards. In future, the requirements for controllers must apply generally, to both external control personnel and internal supervisory personnel, and must be verified.
Abstract:
To study the behaviour of beam-to-column composite connections, more sophisticated finite element models are required, since the component model has some severe limitations. In this research a generic finite element model for composite beam-to-column joints with welded connections is developed using current state-of-the-art local modelling. By applying a mechanically consistent scaling method, it can provide the constitutive relationship for a plane rectangular macro element with beam-type boundaries. This macro element, which preserves local behaviour and allows the transfer of five independent states between local and global models, can then be implemented in high-accuracy frame analysis with the possibility of limit-state checks. So that the macro element for the scaling method can be used in a practical manner, a generic geometry program, proposed as a new idea in this study, is also developed for this finite element model. With generic programming, a set of global geometric variables can be input to generate a specific instance of the connection without much effort. The proposed finite element model generated by this generic programming is validated against test results from the University of Kaiserslautern. Finally, two illustrative examples of applying this macro element approach are presented. The first example demonstrates how to obtain the constitutive relationships of the macro element: with certain assumptions for a typical composite frame, the constitutive relationships can be represented by bilinear laws for the macro bending and shear states, which are then coupled by a two-dimensional surface law with yield and failure surfaces. The second example presents a scaling concept that combines sophisticated local models with a frame analysis using a macro element approach, as a practical application of this numerical model.
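A bilinear law of the kind mentioned for the macro bending and shear states has a simple generic shape: an elastic branch up to a yield point, then a reduced hardening stiffness, symmetric in sign. A sketch with invented parameter values (the actual element additionally couples the two states through a yield/failure surface, which is not modelled here):

```python
def bilinear_law(theta, k_elastic, theta_yield, k_hardening):
    """Bilinear generalized force-deformation law: elastic stiffness up to
    the yield deformation, then a hardening stiffness, symmetric in sign."""
    if abs(theta) <= theta_yield:
        return k_elastic * theta
    sign = 1.0 if theta > 0 else -1.0
    return sign * (k_elastic * theta_yield
                   + k_hardening * (abs(theta) - theta_yield))
```

Such closed-form laws are what make the macro element cheap enough for frame-level analysis while still reflecting the local finite element response they were scaled from.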
Abstract:
We present a trainable system for detecting frontal and near-frontal views of faces in still gray images using Support Vector Machines (SVMs). We first consider the problem of detecting the whole face pattern by a single SVM classifier. In this context we compare different types of image features, present and evaluate a new method for reducing the number of features, and discuss practical issues concerning the parameterization of SVMs and the selection of training data. The second part of the paper describes a component-based method for face detection consisting of a two-level hierarchy of SVM classifiers. On the first level, component classifiers independently detect components of a face, such as the eyes, the nose, and the mouth. On the second level, a single classifier checks if the geometrical configuration of the detected components in the image matches a geometrical model of a face.
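In the paper the second-level check is itself an SVM over the component positions; a hand-written geometric rule conveys the same idea in a few lines. This is only an illustrative stand-in with invented thresholds, not the authors' classifier:

```python
def plausible_face(left_eye, right_eye, nose, mouth):
    """Rule-based stand-in for the second-level classifier: accept a set of
    detected components only if their layout matches a coarse face model.
    Points are (x, y) in image coordinates, y increasing downward."""
    (lx, ly), (rx, ry), (nx, ny), (mx, my) = left_eye, right_eye, nose, mouth
    eye_span = abs(rx - lx)
    eyes_level = abs(ly - ry) < 0.25 * eye_span        # eyes roughly horizontal
    nose_ok = min(lx, rx) < nx < max(lx, rx) and ny > max(ly, ry)
    mouth_ok = min(lx, rx) < mx < max(lx, rx) and my > ny
    return eyes_level and nose_ok and mouth_ok
```

The benefit of the two-level design is exactly this: spurious first-level detections (a "mouth" above the "eyes", say) are rejected by the configuration check even when each component detector fires confidently.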
Abstract:
Memory errors are a common cause of incorrect software execution and security vulnerabilities. We have developed two new techniques that help software continue to execute successfully through memory errors: failure-oblivious computing and boundless memory blocks. The foundation of both techniques is a compiler that generates code that checks accesses via pointers to detect out-of-bounds accesses. Instead of terminating or throwing an exception, the generated code takes another action that keeps the program executing without memory corruption. Failure-oblivious code simply discards invalid writes and manufactures values to return for invalid reads, enabling the program to continue its normal execution path. Code that implements boundless memory blocks stores invalid writes away in a hash table to return as the values for corresponding out-of-bounds reads. The net effect is to (conceptually) give each allocated memory block unbounded size and to eliminate out-of-bounds accesses as a programming error. We have implemented both techniques and acquired several widely used open-source servers (Apache, Sendmail, Pine, Mutt, and Midnight Commander). With standard compilers, all of these servers are vulnerable to buffer overflow attacks as documented at security tracking web sites. Both failure-oblivious computing and boundless memory blocks eliminate these security vulnerabilities (as well as other memory errors). Our results show that our compiler enables the servers to execute successfully through buffer overflow attacks and to continue to correctly service user requests without security vulnerabilities.
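The real techniques are implemented by a C compiler that inserts bounds checks around pointer accesses; the boundless-memory-block semantics, however, can be sketched in a few lines of Python. This is a conceptual model only, not the paper's implementation:

```python
class BoundlessBlock:
    """Toy model of a boundless memory block: in-bounds accesses hit the real
    backing store; out-of-bounds writes are stashed in a hash table, and
    out-of-bounds reads return the stashed value or a manufactured default."""

    def __init__(self, size, default=0):
        self._data = [default] * size   # the "real" allocated block
        self._overflow = {}             # hash table for out-of-bounds writes
        self._default = default

    def __getitem__(self, index):
        if 0 <= index < len(self._data):
            return self._data[index]
        # out-of-bounds read: stashed value, else a manufactured default
        return self._overflow.get(index, self._default)

    def __setitem__(self, index, value):
        if 0 <= index < len(self._data):
            self._data[index] = value
        else:
            self._overflow[index] = value  # stash instead of corrupting memory
```

A buffer-overflow write lands in the hash table rather than in an adjacent allocation, so a later matching read still sees the written value while the rest of memory stays intact; the failure-oblivious variant would instead discard the write and manufacture the read.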