884 results for Standard and Eurocode design fire curves
Abstract:
We present the treatment rationale and study design of the MetLung phase III study. This study will investigate onartuzumab (MetMAb) in combination with erlotinib compared with erlotinib alone, as second- or third-line treatment, in patients with advanced non-small-cell lung cancer (NSCLC) who are Met-positive by immunohistochemistry. Approximately 490 patients (245 per treatment arm) will receive erlotinib (150 mg oral daily) plus onartuzumab or placebo (15 mg/kg intravenous every 3 weeks) until disease progression, unacceptable toxicity, patient or physician decision to discontinue, or death. The efficacy objectives of this study are to compare overall survival (OS) (primary endpoint), progression-free survival, and response rates between the 2 treatment arms. In addition, safety, quality of life, pharmacokinetics, and translational research will be investigated across treatment arms. If the primary objective (OS) is achieved, this study will provide robust results toward an alternative treatment option for patients with Met-positive second- or third-line NSCLC. © 2012 Elsevier Inc. All Rights Reserved.
Abstract:
This paper explores the literature and analyses the different uses and understandings of the word “design” in countries colonised by Portugal, using Brazil as the main example. It investigates the relationship between the linguistic existence of terms to define and describe “design” as an activity and field, and the roles and perceptions of Design in society at large. It also addresses the cultural effects that the lack of a proper translation has on the local community. The current perception of Design in former Portuguese colonies is associated with two main aspects, one linguistic and one historical, both of which distinguish the countries under consideration from countries with a different background. The changes in the meaning of “design” over the years have had a great impact on people's perceptions of Design. Conversely, the development of Design has also influenced the changes in the meaning of the term, both as a legacy of the colonial period and as a characteristic of the Portuguese language. Design has developed and reached a level of excellence in these countries that competes with the most established Design cultures in the world. However, this level of Design remains enmeshed in an elite of universities and specialised markets, and is therefore not democratised. The ultimate aim of this study is to promote discussion on how to make the discourse surrounding this area more accessible to people from non-English-speaking countries that do not have the word “design” in their local language.
Abstract:
Light gauge steel frame (LSF) floor systems are generally made of lipped channel section joists and lined with gypsum plasterboards to provide an adequate fire resistance rating under fire conditions. Recently a new LSF floor system made of welded hollow flange channel (HFC) sections was developed and its fire performance was investigated using full-scale fire tests. The new floor system gave higher fire resistance ratings than conventional LSF floor systems. To avoid expensive and time-consuming full-scale fire tests, finite element analyses were also performed to simulate the fire performance of LSF floors made of HFC joists using both steady state and transient state methods. This paper presents the details of the finite element models of HFC joists developed to simulate the structural fire performance of these LSF floor systems under standard fire conditions. Finite element analyses were performed using the measured time–temperature profiles of the failed joists from the fire tests, and their failure times, temperatures and modes, and deflection versus time curves were obtained. The developed finite element models successfully predicted the structural performance of LSF floors made of HFC joists under fire conditions. They were able to simulate the complex behaviour of thin cold-formed steel joists subjected to non-uniform temperature distributions, including local buckling and yielding effects. This study also confirmed the superior fire performance of the newly developed LSF floors made of HFC joists.
Abstract:
179 p.
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design and crew emergency procedures are safe in an evacuation resulting from fire or another incident? In the wake of major maritime disasters such as the Scandinavian Star, Herald of Free Enterprise and Estonia, and in light of the growth in the numbers of high-density high-speed ferries and large-capacity cruise ships, issues concerning the evacuation of passengers and crew at sea are receiving renewed interest. Fire and evacuation models are now available with features such as the ability to realistically simulate the spread of fire, fire suppression systems and the human response to fire, as well as the capability to model human performance in heeled orientations, linked to a virtual reality environment that produces realistic visualisations of modelled scenarios; these can be used to aid the engineer in assessing ship design and procedures. This paper describes the maritimeEXODUS ship evacuation model and the SMARTFIRE fire simulation model and provides an example application demonstrating their use in performing fire and evacuation analysis for a large passenger ship, based in part on the requirements of MSC Circular 1033. The fire simulations include the action of a water mist system.
Abstract:
The inclusion of the Discrete Wavelet Transform in the JPEG-2000 standard has added impetus to the research of hardware architectures for the two-dimensional wavelet transform. In this paper, a VLSI architecture for performing the symmetrically extended two-dimensional transform is presented. This architecture conforms to the JPEG-2000 standard and is capable of near-optimal performance when dealing with the image boundaries. The architecture also achieves efficient processor utilization. Implementation results based on a Xilinx Virtex-2 FPGA device are included.
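For readers unfamiliar with the transform that such an architecture implements, the sketch below shows, in plain Python, one level of the reversible 5/3 lifting DWT with whole-sample symmetric extension as used in JPEG-2000. It is only a minimal software reference for the boundary handling the hardware must reproduce, not a description of the VLSI architecture itself; the function names and the even-length assumption are illustrative.

```python
import numpy as np

def dwt53_level(x):
    """One level of the JPEG-2000 reversible 5/3 lifting DWT on a 1-D integer
    signal of even length, with whole-sample symmetric extension at the edges."""
    x = np.asarray(x, dtype=np.int64)
    n = x.size
    assert n % 2 == 0, "sketch assumes an even-length row/column"

    def ext(i):                                   # symmetric index reflection
        return -i if i < 0 else (2 * (n - 1) - i if i > n - 1 else i)

    half = n // 2
    # Predict step: detail (high-pass) coefficient d[i] sits on odd sample 2i+1.
    d = np.array([x[2*i + 1] - (x[ext(2*i)] + x[ext(2*i + 2)]) // 2
                  for i in range(half)], dtype=np.int64)
    # Update step: approximation (low-pass) s[i] sits on even sample 2i;
    # the symmetric extension gives d[-1] = d[0] at the left boundary.
    s = np.array([x[2*i] + (d[max(i - 1, 0)] + d[i] + 2) // 4
                  for i in range(half)], dtype=np.int64)
    return np.concatenate([s, d])                 # [low-pass | high-pass]

def dwt53_2d(img):
    """Separable 2-D transform: rows first, then columns (one level)."""
    rows = np.array([dwt53_level(r) for r in np.asarray(img)])
    return np.array([dwt53_level(c) for c in rows.T]).T

tile = np.arange(64, dtype=np.int64).reshape(8, 8)
print(dwt53_2d(tile))    # quadrant layout: LL / HL over LH / HH (approximate labels)
```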
Abstract:
This paper presents the results of a full-scale site fire test performed on a cold-formed steel portal frame building with semi-rigid joints. The purpose of the study is to establish a performance-based approach for the design of such structures in fire boundary conditions. In the full-scale site fire test, the building collapsed asymmetrically at a temperature of 714°C. A non-linear elasto-plastic finite-element shell model is described and is validated against the results of the full-scale test. A parametric study is presented that highlights the importance of in-plane restraint from the side rails in preventing an outwards sway failure for both a single portal and full building geometry model. The study also demonstrates that the semi-rigidity of the joints should be taken into account in the design. The single portal and full building geometry models display a close match to site test results with failure at 682°C and 704°C, respectively. A design case is described in accordance with Steel Construction Institute design recommendations. The validated single portal model is tested with pinned bases, columns protected, realistic loading and rafters subject to symmetric uniform heating in accordance with the ISO 834 standard fire curve; failure occurs at 703°C.
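For reference, the heating regime named here, the ISO 834 standard fire (also the standard temperature-time curve of EN 1991-1-2), follows a simple closed-form expression. The short Python sketch below evaluates it together with the other two nominal Eurocode design fire curves; it is purely illustrative and not taken from the paper.

```python
import math

def standard_fire(t_min):
    """ISO 834 / EN 1991-1-2 standard temperature-time curve (gas temperature, deg C),
    with t_min the time in minutes from the start of heating."""
    return 20.0 + 345.0 * math.log10(8.0 * t_min + 1.0)

def hydrocarbon_fire(t_min):
    """EN 1991-1-2 hydrocarbon curve, for more severe petrochemical-type fires."""
    return 20.0 + 1080.0 * (1.0 - 0.325 * math.exp(-0.167 * t_min)
                                - 0.675 * math.exp(-2.5 * t_min))

def external_fire(t_min):
    """EN 1991-1-2 external fire curve, for members outside the fire compartment."""
    return 20.0 + 660.0 * (1.0 - 0.687 * math.exp(-0.32 * t_min)
                               - 0.313 * math.exp(-3.8 * t_min))

for t in (15, 30, 60, 90):
    print(f"t = {t:3d} min: standard = {standard_fire(t):4.0f} C, "
          f"hydrocarbon = {hydrocarbon_fire(t):4.0f} C, external = {external_fire(t):3.0f} C")
```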
Abstract:
Building secure systems is difficult for many reasons. This paper deals with two of the main challenges: (i) the lack of security expertise in development teams, and (ii) the inadequacy of existing methodologies to support developers who are not security experts. The security standard ISO/IEC 15408 (Common Criteria), together with secure design techniques such as UMLsec, can provide the security expertise, knowledge, and guidelines that are needed. However, security expertise and guidelines are not stated explicitly in the Common Criteria; they are phrased in security domain terminology and are difficult for developers to understand. This means that some general security and secure design expertise is required to take full advantage of the Common Criteria and UMLsec. In addition, there is the problem of tracing security requirements and objectives into the solution design, which is needed to prove that the requirements are fulfilled. This paper describes a security requirements engineering methodology called SecReq. SecReq combines three techniques: the Common Criteria, the heuristic requirements editor HeRA, and UMLsec. SecReq makes systematic use of the security engineering knowledge contained in the Common Criteria and UMLsec, as well as security-related heuristics in the HeRA tool. The integrated SecReq method supports early detection of security-related issues (HeRA), their systematic refinement guided by the Common Criteria, and the ability to trace security requirements into UML design models. A feedback loop helps reuse experience within SecReq and turns the approach into an iterative process for the secure system life-cycle, also in the presence of system evolution.
Abstract:
This thesis contributes to a general theory of project design. Framed by the demands of sustainable development, the main objective of this research is to contribute a theoretical model of design that better situates the use of tools and standards for assessing the sustainability of a project. The fundamental principles of these normative instruments are analysed along four dimensions: ontological, methodological, epistemological and teleological. Indicators of certain counter-productive effects, related in particular to the application of these standards, confirm the need for a theory of qualitative judgement. Our main hypothesis builds on the conceptual framework offered by the notion of the "precautionary principle", whose first formulations date back to the early 1970s and which were aimed precisely at remedying the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design thinking, it focuses on how the ways of taking sustainability into account have evolved. From this perspective, theories of "green design" dating from the early 1960s and theories of "ecological design" from the 1970s and 1980s are seen to have converged with the more recent theories of "sustainable design" from the early 1990s onwards. The different approaches to the precautionary principle are then examined from the standpoint of project sustainability. Risk-assessment standards are compared with approaches based on the precautionary principle, revealing certain limits when designing a project. A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched out. This model offers a global view for judging a project that integrates sustainable development principles, and presents itself as an alternative to traditional risk-assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of "prudence" as it was historically used to guide architectural judgement. What, then, of the challenges raised by judging architectural projects amid the rise of standardised assessment methods (e.g. Leadership in Energy and Environmental Design, LEED)? The thesis proposes a reinterpretation of design theory as formulated by Donald A. Schön as a way of taking assessment tools such as LEED into account. This exercise nevertheless reveals an epistemological obstacle that has to be addressed in a reformulation of the model. In keeping with constructivist epistemology, the new theoretical model is then confronted with the study and illustration of three contemporary Canadian architecture competitions that adopted the LEED standardised sustainability assessment method. A preliminary series of "tensions" is identified in the process of designing and judging the projects. These tensions are then categorised into their conceptual counterparts, constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualisation - analogical/logical; (2) uncertainty - epistemological/methodological; (3) comparability - interpretive/analytical; and (4) proposition - universality/contextual relevance. These conceptual tensions are treated as vectors that correlate with the theoretical model, which they help to enrich without constituting validations in the positivist sense of the term. These confrontations with real cases make it possible to better define the epistemological obstacle identified earlier. This thesis thus highlights the generally underestimated impact of environmental standardisation on the process of designing and judging projects, taking as its example, in a non-restrictive way, Canadian architecture competitions for public buildings. The conclusion underlines the need for a new form of "reflective prudence" and a more critical use of current sustainability assessment tools, and calls for instruments based on the overall integration, rather than the opposition, of environmental approaches.
Abstract:
Changes are made continuously to software source code to accommodate customer needs and to fix faults. Continuous change can lead to code and design defects. Design defects are poor solutions to recurring design or implementation problems, typically in object-oriented development. During comprehension and change activities, and because of time-to-market pressure, limited understanding, and their experience, developers cannot always follow design standards and coding techniques such as design patterns; consequently, they introduce design defects into their systems. In the literature, several authors have argued that design defects make object-oriented systems harder to understand, more fault-prone, and harder to change than systems without design defects. Yet only a few of these authors have studied empirically the impact of design defects on comprehension, and none of them has studied the impact of design defects on the effort developers spend fixing faults. In this thesis, we make three main contributions. The first contribution is an empirical study providing evidence of the impact of design defects on comprehension and change. We design and conduct two experiments with 59 subjects to assess the impact of the composition of two occurrences of Blob or two occurrences of spaghetti code on the performance of developers carrying out comprehension and change tasks. We measure developer performance using (1) the NASA task load index for their effort, (2) the time they spent completing their tasks, and (3) the percentage of correct answers. The results of the two experiments showed that two occurrences of Blob or spaghetti code are a significant obstacle to developer performance during comprehension and change tasks. These results justify previous research on the specification and detection of design defects. Software development teams should warn developers about high numbers of design defect occurrences and recommend refactorings at each step of the development process to remove these defects whenever possible. In the second contribution, we study the relationship between design defects and faults, investigating the impact of the presence of design defects on the effort required to fix faults. We measure fault-fixing effort using three indicators: (1) the duration of the fixing period, (2) the number of fields and methods touched by the fix, and (3) the entropy of the fixes in the source code. We conduct an empirical study with 12 design defects detected in 54 releases of four systems: ArgoUML, Eclipse, Mylyn, and Rhino. Our results showed that the fixing period is longer for faults involving classes with design defects, and that fixing faults in classes with design defects changes more files, fields, and methods. We also observed that, after a fault is fixed, the number of design defect occurrences in the classes involved in the fix decreases. Understanding the impact of design defects on the effort developers spend fixing faults is important to help development teams better assess and anticipate the impact of their design decisions, and thus channel their efforts toward improving the quality of their systems. Development teams should monitor and remove design defects from their systems because they are likely to increase change effort. The third contribution concerns the detection of design defects. During maintenance activities, it is important to have a tool that can detect design defects incrementally and iteratively. Such an incremental and iterative detection process could reduce cost, effort, and resources by allowing practitioners to identify and deal with design defect occurrences as they find them during comprehension and change. Researchers have proposed approaches to detect design defect occurrences, but these approaches currently have four limitations: (1) they require extensive knowledge of design defects, (2) they have limited precision and recall, (3) they are not iterative and incremental, and (4) they cannot be applied to subsets of systems. To overcome these limitations, we introduce SMURF, a new approach to detect design defects based on a machine learning technique, support vector machines, that takes practitioners' feedback into account. Through an empirical study on three systems and four design defects, we showed that SMURF's precision and recall are higher than those of DETEX and BDTEX when detecting design defect occurrences. We also showed that SMURF can be applied in both intra-system and inter-system configurations. Finally, we showed that SMURF's precision and recall improve when practitioners' feedback is taken into account.
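As a rough illustration of the kind of machine-learning pipeline the abstract describes (not SMURF itself, whose features and training data are not given here), the sketch below trains a support vector machine on hypothetical class-level metrics labelled as Blob or clean, then retrains it once a practitioner confirms a new suspect, which is the iterative, feedback-driven flavour of the approach.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical class-level metrics: [lines of code, methods, coupling, cohesion].
X = np.array([
    [1200, 45, 30, 0.10],   # Blob-like
    [  80,  6,  4, 0.85],
    [ 950, 38, 22, 0.15],   # Blob-like
    [ 120,  9,  5, 0.70],
    [  60,  5,  3, 0.90],
    [1500, 52, 35, 0.05],   # Blob-like
])
y = np.array([1, 0, 1, 0, 0, 1])             # 1 = Blob occurrence, 0 = clean

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

suspect = np.array([[1100, 40, 28, 0.12]])   # a newly analysed class
print("flagged as defect:", bool(clf.predict(suspect)[0]))

# Practitioner feedback: the suspect is confirmed as a Blob, so it is added to
# the training set and the classifier is retrained with the new label.
X = np.vstack([X, suspect])
y = np.append(y, 1)
clf.fit(X, y)
```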
Abstract:
The demand for new telecommunication services requiring higher capacities, data rates and different operating modes has motivated the development of a new generation of multi-standard wireless transceivers. A multi-standard design often involves extensive system-level analysis and architectural partitioning, typically requiring extensive calculations. In this research, a decimation filter design tool for the wireless communication standards GSM, WCDMA, WLANa, WLANb, WLANg and WiMAX is developed in MATLAB® using the GUIDE environment for visual analysis. The user can select a required wireless communication standard and obtain the corresponding multistage decimation filter implementation using this toolbox. The toolbox helps the user or design engineer perform a quick design and analysis of decimation filters for multiple standards without extensive manual calculation of the underlying methods.
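By way of illustration only (the abstract does not give the filter specifications used in the toolbox), the sketch below decimates a hypothetical 13 MHz sigma-delta modulator output down to the 270.833 kHz GSM symbol rate in three stages using SciPy. Splitting the overall factor of 48 into 8 x 3 x 2 keeps each stage's FIR filter short, which is the usual motivation for a multistage design.

```python
import numpy as np
from scipy import signal

fs_in = 13.0e6                # hypothetical modulator output rate (Hz)
stages = [8, 3, 2]            # overall decimation factor 48 -> 270.833 kHz

t = np.arange(int(2e-3 * fs_in)) / fs_in
x = np.cos(2 * np.pi * 50e3 * t) + 0.01 * np.random.randn(t.size)  # 50 kHz tone + noise

fs = fs_in
for m in stages:
    # A modest 30-tap FIR per stage; a single-stage design at the full input
    # rate would need a far longer filter for the same transition band.
    x = signal.decimate(x, m, n=30, ftype="fir", zero_phase=True)
    fs /= m

print(f"output rate = {fs / 1e3:.3f} kHz, output samples = {x.size}")
```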
Abstract:
A ligand-based drug design study was performed on acetaminophen regioisomers as analgesic candidates, employing quantum chemical calculations at the DFT/B3LYP level of theory with the 6-31G* basis set. Several molecular descriptors were used, such as the highest occupied molecular orbital energy, ionization potential, O-H bond dissociation energies, and spin densities, which may be related to the ability to quench the tyrosyl radical and give N-acetyl-p-benzosemiquinone-imine through initial electron transfer or hydrogen atom abstraction. Based on this in silico work, the most promising molecule, orthobenzamol, was synthesized and tested. The results expected from the theoretical prediction were confirmed in vivo using mouse models of nociception such as the writhing, paw licking, and hot plate tests. All biological results suggested an antinociceptive activity mediated by opioid receptors. Furthermore, at 90 and 120 min, this new compound had an effect comparable to morphine, the standard drug for this test. Finally, the pharmacophore model is discussed in terms of the electronic properties derived from the quantum chemistry calculations.
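As a minimal sketch of how one of the listed descriptors can be obtained (assuming the Psi4 package's Python API; the geometry below is a water placeholder rather than one of the acetaminophen regioisomers, and only the functional and basis set match the abstract), a single B3LYP/6-31G* calculation yields the HOMO energy, from which a Koopmans-style ionization potential estimate follows as IP ≈ -E(HOMO).

```python
import psi4

psi4.set_memory("2 GB")
psi4.set_options({"basis": "6-31G*"})

# Placeholder geometry (water); in the study this would be each regioisomer.
mol = psi4.geometry("""
0 1
O   0.000000   0.000000   0.117300
H   0.000000   0.757200  -0.469200
H   0.000000  -0.757200  -0.469200
""")

energy, wfn = psi4.energy("b3lyp", return_wfn=True)
homo = wfn.epsilon_a().np[wfn.nalpha() - 1]      # HOMO energy in hartree
print(f"E(B3LYP) = {energy:.6f} Eh   HOMO = {homo:.4f} Eh   Koopmans IP ~ {-homo:.4f} Eh")
```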
Abstract:
The new Brazilian ABNT NBR 15575 Standard (the "Standard") recommends two methods for analyzing housing thermal performance: a simplified method and a computational simulation method. The aim of this paper is to evaluate both methods and the coherence between them. To this end, the thermal performance of a low-cost single-family house was evaluated by applying the procedures prescribed by the Standard, using the EnergyPlus software. Comparative analyses of the house with varying envelope U-values and solar absorptance of the external walls were performed in order to evaluate the influence of these parameters on the results. The results show limitations in the Standard's current computational simulation method, arising from several aspects: the weather files, the lack of consideration of passive strategies, and inconsistency with the simplified method. This research therefore indicates that there are aspects to be improved in the Standard so that it can better represent the real thermal performance of social housing in Brazil.
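As a small worked illustration of one of the envelope parameters varied in these analyses (not data from the paper; the layer properties and surface resistances below are generic values), the U-value of a layered external wall is the reciprocal of its total thermal resistance, i.e. the surface film resistances plus the sum of thickness divided by conductivity for each layer.

```python
# Illustrative U-value calculation for a rendered single-leaf wall.
R_SI, R_SE = 0.13, 0.04          # internal/external surface resistances, m2.K/W

layers = [                        # (thickness in m, conductivity in W/m.K)
    (0.025, 0.90),                # external render
    (0.090, 0.90),                # brick leaf (simplified as homogeneous)
    (0.025, 0.90),                # internal render
]

R_total = R_SI + sum(t / k for t, k in layers) + R_SE
U = 1.0 / R_total
print(f"wall U-value = {U:.2f} W/m2.K")
```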
Abstract:
Synthetic biology has developed greatly in recent years: many papers have been published and many applications presented, spanning from the production of biopharmaceuticals to the synthesis of bioenergetic substrates or industrial catalysts. Despite these advances, most applications are quite simple and do not fully exploit the potential of the discipline. This limitation in complexity has many causes, such as the incomplete characterization of some components or the intrinsic variability of biological systems, but one of the most important is the inability of the cell to sustain the additional metabolic burden introduced by a complex circuit. The objective of the project of which this work is part is to address this problem by engineering a multicellular behaviour in prokaryotic cells. This system introduces a cooperative behaviour that allows complex functionalities to be implemented that cannot be obtained with a single cell. In particular, the goal is to implement Leader Election, a procedure first devised in the field of distributed computing to identify a single process as the organizer and coordinator of a series of tasks assigned to the whole population. Electing a leader greatly simplifies the computation by providing centralized control. Furthermore, this system may also be useful for evolutionary studies that aim to explain how complex organisms evolved from unicellular systems. The work presented here describes, in particular, the design and experimental characterization of one component of the circuit that solves the Leader Election problem. This module, composed of a hybrid promoter and a gene, is activated in the non-leader cells after they receive the signal that a leader is present in the colony. The most important element in this case is the hybrid promoter; it was realized in different versions, applying the heuristic rules stated in [22], and their activity was tested experimentally. The objective of the experimental characterization was to test the response of the genetic circuit to the introduction into the cellular environment of particular molecules, the inducers, which can be considered the inputs of the system. The desired behaviour is similar to that of a logic AND gate, in which the output, represented by the luminous signal produced by a fluorescent protein, is 1 only in the presence of both inducers. The robustness and stability of this behaviour were tested by varying the concentration of the input signals and building dose-response curves. From these data it can be concluded that the analysed constructs have an AND-like behaviour over a wide range of inducer concentrations, even though many differences can be identified in the expression profiles of the different constructs. This variability reflects the fact that the input and output signals are continuous, so a binary representation cannot capture the full complexity of the behaviour. The module of the circuit considered in this analysis has a fundamental role in the realization of the intercellular communication system that is necessary for the cooperative behaviour to take place. For this reason, the second phase of the characterization focused on the analysis of signal transmission; in particular, the interaction between this element and the one responsible for emitting the chemical signal was tested. The desired behaviour is still similar to a logic AND, since, also in this case, the output signal is determined by the activity of the hybrid promoter. The experimental results demonstrated that the systems behave correctly, even though there is still substantial variability between them. The dose-response curves highlighted that stricter constraints on the inducer concentrations need to be imposed in order to obtain a clear separation between the two levels of expression. In the concluding chapter the DNA sequences of the hybrid promoters are analysed in an attempt to identify the regulatory elements that are most important in determining gene expression; given the available data, it was not possible to draw definitive conclusions. Finally, a few considerations on promoter engineering and the realization of complex circuits are presented, briefly recalling some of the problems outlined in the introduction and offering a few possible solutions.
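A common way to express the AND-like dose-response described above is a product of two Hill functions, one per inducer, so that strong expression requires both inputs to be high. The sketch below uses hypothetical parameter values (maximum expression, leakage, Hill coefficients and half-activation constants), not values fitted to the constructs characterized in this work.

```python
import numpy as np

def and_gate_response(a, b, k_max=1000.0, leak=10.0,
                      K_a=50.0, K_b=20.0, n_a=2.0, n_b=2.0):
    """Steady-state output of a hybrid promoter modelled as the product of two
    Hill activation terms: high fluorescence only when both inducers are high."""
    hill_a = a**n_a / (K_a**n_a + a**n_a)
    hill_b = b**n_b / (K_b**n_b + b**n_b)
    return leak + k_max * hill_a * hill_b

# Coarse dose-response surface over both inducer concentrations (arbitrary units).
for a in np.logspace(-1, 3, 5):
    row = " ".join(f"{and_gate_response(a, b):7.1f}" for b in np.logspace(-1, 3, 5))
    print(f"inducer A = {a:7.1f} | {row}")
```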
Abstract:
A broadband primary standard for thermal noise measurements is presented and its thermal and electromagnetic behaviour is analysed by means of a novel hybrid analytical-numerical simulation methodology. The standard consists of a broadband termination connected to a 3.5 mm coaxial airline partially immersed in liquid nitrogen, and is designed to obtain low reflectivity and a low uncertainty in the noise temperature. A detailed sensitivity analysis is made in order to highlight the critical characteristics that most affect the uncertainty in the noise temperature, and also to determine the manufacturing and operating tolerances required for proper performance in the range 10 MHz to 26.5 GHz. Aspects such as the thermal bead design, the level of liquid nitrogen, and the uncertainties associated with the temperatures, the physical properties of the materials in the standard and the simulation techniques are discussed.
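A minimal sketch of the textbook relation behind such a cryogenic noise standard (not the hybrid analytical-numerical methodology of the paper): the noise temperature of the load is propagated through the partially cooled airline by splitting it into short matched sections, each with available gain G at its local physical temperature, using T_out = G*T_in + (1 - G)*T_phys. The section losses and temperature profile below are illustrative values only.

```python
import numpy as np

def cascade_noise_temperature(T_load, section_loss_db, section_temps):
    """Noise temperature delivered at the output plane of a chain of lossy,
    matched line sections, each at its own physical temperature."""
    T = T_load
    for loss_db, T_phys in zip(section_loss_db, section_temps):
        G = 10.0 ** (-loss_db / 10.0)        # available gain of the section
        T = G * T + (1.0 - G) * T_phys       # passive two-port noise relation
    return T

# Illustrative: termination at 77.36 K (boiling liquid nitrogen), a 10-section
# airline warming from 77 K to the 296 K ambient flange, 0.01 dB loss per section.
temps = np.linspace(77.36, 296.0, 10)
T_out = cascade_noise_temperature(77.36, [0.01] * 10, temps)
print(f"noise temperature at the output connector: {T_out:.2f} K")
```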