505 results for Catastrophic Misinterpretation
Abstract:
Climate change and continuous urbanization contribute to an increased urban vulnerability towards flooding. Relying only on traditional flood control measures is recognized as inadequate, since the damage can be catastrophic if flood controls fail. The idea of a flood-resilient city – one which can withstand or adapt to a flood event without being harmed in its functionality – seems promising. But what does resilience actually mean when it is applied to urban environments exposed to flood risk, and how can resilience be achieved? This paper presents a heuristic framework for assessing the flood resilience of cities, for scientists and policy-makers alike. It enriches the current literature on flood resilience by clarifying the meaning of its three key characteristics – robustness, adaptability and transformability – and identifying important components for implementing resilience strategies. The resilience discussion moves a step forward, from predominantly defining resilience to generating insight into "doing" resilience in practice. The framework is illustrated with two case studies from Hamburg, showing that resilience, and particularly the underlying notions of adaptability and transformability, first and foremost require further capacity-building among public as well as private stakeholders. The case studies suggest that flood resilience alone does not currently provide sufficient motivation to move from traditional to more resilient flood protection schemes in practice; rather, it needs to be integrated into a broader urban agenda.
Abstract:
The study of the Upper Jurassic-Lower Cretaceous deposits (Higueruelas, Villar del Arzobispo and Aldea de Cortés Formations) of the South Iberian Basin (NW Valencia, Spain) reveals new stratigraphic and sedimentological data, which have significant implications for the stratigraphic framework, depositional environments and age of these units. The Higueruelas Fm was deposited in a mid-inner carbonate platform where oncolitic bars migrated by the action of storms and where oncoid production progressively decreased towards the uppermost part of the unit. The overlying Villar del Arzobispo Fm has traditionally been interpreted as an inner platform-lagoon evolving into a tidal flat. Here it is interpreted as an inner carbonate platform affected by storms, where oolitic shoals protected a lagoon that received siliciclastic input from the continent. The Aldea de Cortés Fm has previously been interpreted as a lagoon surrounded by tidal flats and fluvial-deltaic plains. Here it is reinterpreted as a coastal wetland where siliciclastic muddy deposits interacted with shallow fresh to marine water bodies, aeolian dunes and continental siliciclastic inputs. The contact between the Higueruelas and Villar del Arzobispo Fms, classically defined as gradual, is also interpreted here as rapid. More importantly, the contact between the Villar del Arzobispo and Aldea de Cortés Fms, previously considered unconformable, is here interpreted as gradual. The presence of Alveosepta in the Villar del Arzobispo Fm suggests that at least part of this unit is Kimmeridgian, unlike the previously assigned Late Tithonian-Middle Berriasian age. Consequently, the underlying Higueruelas Fm, previously considered Tithonian, should not be younger than Kimmeridgian. Accordingly, sedimentation of the Aldea de Cortés Fm, previously considered Valanginian-Hauterivian, probably started during the Tithonian, and it may be considered part of the regressive trend of the Late Jurassic-Early Cretaceous cycle.
This is consistent with the dinosaur faunas, typically Jurassic, described in the Villar del Arzobispo and Aldea de Cortés Fms.
Abstract:
Several landforms found in the fold-and-thrust belt of the Central Precordillera, Pre-Andes of Argentina, which have often been attributed to tectonic stresses, are in fact related to non-tectonic processes, namely superficial gravitational structures. These second-order structures, interpreted as gravitational collapse structures, developed on the western flank of the sierras de La Dehesa and Talacasto. They include rock slides, rock falls, wrinkle folds, slip sheets and flaps, among others, developed within a monoclinal fold dipping between 30° and 60° to the west. The gravity collapse structures are parallel to the regional strike of the Sierra de la Dehesa and are emplaced in Ordovician limestones and dolomites. Their occurrence would have been favored by the westward dip of the slope; the presence of bedding planes, fractures and joints; and the lithology (limestone interbedded with incompetent argillaceous banks). Movement of the detached structures was controlled by lithological characteristics as well as by bedding and joints. Detachment and initial transport of the gravity collapse structures and rockslides on the western flank of the Sierra de la Dehesa were tightly controlled by three structural elements: 1) sliding surfaces developed on parallel-bedded strata dipping >30° in the slope direction; 2) joint sets constituting lateral and transverse traction cracks that release extensional stresses; and 3) discontinuities fragmenting the sliding surfaces. Other factors, both local (lithology, structure and topography) and regional (high seismic activity and possibly wetter conditions during the postglacial period), were decisive in favoring the progressive failure of the western mountainside in the easternmost foothills of the Central Precordillera.
Abstract:
Nano-scale touch screen thin films have not been thoroughly investigated in terms of dynamic impact analysis under various strain rates. This research is focused on two different thin films, Zinc Oxide (ZnO) film and Indium Tin Oxide (ITO) film, deposited on Polyethylene Terephthalate (PET) substrates for standard touch screen panels. Dynamic Mechanical Analysis (DMA) was performed on the ZnO film coated PET substrates. Nano-impact (fatigue) testing was performed on ITO film coated PET substrates. Other analyses include hardness and elastic modulus measurements, atomic force microscopy (AFM), Fourier Transform Infrared Spectroscopy (FTIR) and Scanning Electron Microscopy (SEM) of the film surface.
Tan delta in DMA is the ratio of the loss modulus (viscous properties) to the storage modulus (elastic properties) of the material, and its peak identifies the glass transition temperature (Tg). In essence, Tg marks the change from the glassy to the rubbery state of the material, and for our sample ZnO film, Tg was found to be 388.3 K. The DMA results also showed that the tan delta curve increases monotonically in the viscoelastic state (before Tg) and decreases sharply in the rubbery state (after Tg) until recrystallization of the ZnO takes place. This leads to the interpretation that enhanced ductility can be achieved at the expense of the strength of the material.
For the nano-impact testing of the ITO-coated PET, damage started with crack initiation and propagation. The interpretation of the nano-impact results depended on the characteristics of the loading history. Under nano-impact loading, the surface of the ITO film suffered several forms of damage, ranging from deformation to catastrophic failure. It is concluded that in this type of application the films should have low residual stress to prevent deformation, good adhesive strength, durability and good wear resistance.
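The tan delta relationship described above lends itself to a short numerical sketch. Only the ratio definition (tan δ = loss modulus / storage modulus, peak at Tg) and the 388.3 K value come from the abstract; the curve shapes and moduli below are invented for illustration:

```python
import numpy as np

# Synthetic DMA data: tan(delta) = E''/E' (loss over storage modulus);
# its peak over the temperature sweep marks the glass transition Tg.
# The 388.3 K transition comes from the abstract; the curves are made up.
T = np.arange(300.0, 451.0)                        # temperature sweep, K (1 K steps)
storage = 2e9 / (1 + np.exp((T - 388.3) / 8.0))    # E' falls through the transition
loss = storage * (0.05 + 0.45 * np.exp(-((T - 388.3) / 10.0) ** 2))  # E'' peaks near Tg

tan_delta = loss / storage                         # the ratio defined in the abstract
Tg = T[np.argmax(tan_delta)]                       # grid point of the tan-delta peak
print(f"Estimated Tg: {Tg:.1f} K")                 # ~388 K on this 1 K grid
```

On a real instrument the peak would be located on measured storage/loss data rather than model curves, but the extraction step is the same argmax over the ratio.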
Abstract:
This paper synthesizes and discusses the spatial and temporal patterns of archaeological sites in Ireland, spanning the Neolithic period and the Bronze Age transition (4300–1900 cal BC), in order to explore the timing and implications of the main changes that occurred in the archaeological record of that period. Large amounts of new data are sourced from unpublished developer-led excavations and combined with national archives, published excavations and online databases. Bayesian radiocarbon models and context- and sample-sensitive summed radiocarbon probabilities are used to examine the dataset. The study captures the scale and timing of the initial expansion of Early Neolithic settlement and the ensuing attenuation of all such activity—an apparent boom-and-bust cycle. The Late Neolithic and Chalcolithic periods are characterised by a resurgence and diversification of activity. Contextualisation and spatial analysis of radiocarbon data reveal finer-scale patterning than is usually possible with summed-probability approaches: the boom-and-bust models of prehistoric populations may, in fact, be a misinterpretation of more subtle demographic changes occurring at the same time as cultural change and attendant differences in the archaeological record.
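The summed-probability approach mentioned above can be illustrated with a minimal sketch. The dates, their uncertainties, and the plain-normal densities below are assumptions for illustration only; a real analysis would calibrate each radiocarbon determination against a calibration curve (e.g. IntCal) before summing:

```python
import numpy as np

# Minimal sketch of a summed probability distribution (SPD): each dated
# sample is represented by a probability density over calendar years, and
# the densities are summed. Plain normals stand in for calibrated dates.
years = np.arange(4300, 1899, -1)                  # cal BC grid, 1-year steps

def density(mean: float, sd: float) -> np.ndarray:
    """Normalised density for one (hypothetical) calibrated date."""
    d = np.exp(-0.5 * ((years - mean) / sd) ** 2)
    return d / d.sum()                             # each date integrates to 1

# Invented (mean, sd) pairs: two overlapping early dates, one later date.
dates = [(3800, 50), (3750, 60), (2500, 80)]
spd = sum(density(m, s) for m, s in dates)         # the summed probability
peak = years[np.argmax(spd)]
print(f"SPD peaks around {peak} cal BC")
```

Because each date is normalised, the SPD integrates to the number of dates; clusters of overlapping dates produce peaks, which is why context-sensitive handling (as the abstract stresses) matters before reading peaks as population booms.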
Abstract:
PCR-based immunoglobulin (Ig)/T-cell receptor (TCR) clonality testing in suspected lymphoproliferations has largely been standardized and has consequently become technically feasible in a routine diagnostic setting. Standardization of the pre-analytical and post-analytical phases is now essential to prevent misinterpretation and incorrect conclusions derived from clonality data. As clonality testing is not a quantitative assay, but rather concerns recognition of molecular patterns, guidelines for reliable interpretation and reporting are mandatory. Here, the EuroClonality (BIOMED-2) consortium summarizes important pre- and post-analytical aspects of clonality testing, provides guidelines for interpretation of clonality testing results, and presents a uniform way to report the results of the Ig/TCR assays. Starting from an immunobiological concept, two levels to report Ig/TCR profiles are discerned: the technical description of individual (multiplex) PCR reactions and the overall molecular conclusion for B and T cells. Collectively, the EuroClonality (BIOMED-2) guidelines and consensus reporting system should help to improve the general performance level of clonality assessment and interpretation, which will directly impact on routine clinical management (standardized best-practice) in patients with suspected lymphoproliferations.
Abstract:
Increased complexity in large design and manufacturing organisations requires improvements at the operations management (OM)–applied service (AS) interface to improve project effectiveness. The aim of this paper is to explore the role of Lean in improving the longitudinal efficiency of the OM–AS interface within a large aerospace organisation, using Lean principles and boundary-spanning theory. The methodology was an exploratory longitudinal case approach including exploratory interviews (n = 21), focus groups (n = 2), facilitated action-research workshops (n = 2) and two trials or experiments using longitudinal data involving both OM and AS personnel working at the interface. Lean principles and boundary-spanning theory were drawn upon to guide and interpret the findings. It was found that misinterpretation, and forced implementation, of OM-based Lean terminology and practice in the OM–AS interface space led to delays and misplaced resources. Instead, both OM and AS staff were challenged to develop a cross-boundary understanding of Lean-based boundary (knowledge) objects in interpreting OM requests. The longitudinal findings from the experiments showed that the development of Lean performance measurement and Lean value-stream constructs was more successful when these constructs were treated as boundary (knowledge) objects requiring transformation over time, orchestrating improved effectiveness and leading to consistent terminology and understanding across the OM–AS boundary-spanning team.
Abstract:
Abstract: In the mid-1990s, when I worked for a telecommunications giant, I struggled to gain access to basic geodemographic data. It cost hundreds of thousands of dollars at the time simply to purchase a tile of satellite imagery from Marconi, and it was often cheaper to create my own maps using a digitizer and A0 paper maps. Everything from granular administrative boundaries to rights-of-way to points of interest and geocoding capabilities was either unavailable for the places I was working in throughout Asia or very limited. Control of this data lay either with a government's census and statistical bureau or with a handful of forward-thinking corporations. Twenty years on we find ourselves inundated with data (location and other) that we are challenged to amalgamate, much of it still "dirty" in nature. Open data initiatives such as ODI give us great hope for how we might share information together and capitalize not only on crowdsourcing behavior but also on the implications for positive usage for the environment and for the advancement of humanity. We are already gathering and amassing a great deal of data and insight through excellent citizen science participatory projects across the globe. In early 2015, I delivered a keynote at the Data Made Me Do It conference at UC Berkeley, and in the preceding year an invited talk at the inaugural QSymposium. In gathering research for these presentations, I began to ponder the effect that social machines (in effect, autonomous data collection subjects and objects) might have on social behaviors. I focused on studying the problem of data from various veillance perspectives, with an emphasis on the shortcomings of uberveillance, which include the potential for misinformation, misinterpretation, and information manipulation when context is entirely missing.
As we build advanced systems that rely almost entirely on social machines, we need to ponder the risks associated with following a purely technocratic approach, where machines devoid of intelligence may one day dictate what humans do at the fundamental praxis level. What might be the fallout of uberveillance? Bio: Dr Katina Michael is a professor in the School of Computing and Information Technology at the University of Wollongong. She presently holds the position of Associate Dean – International in the Faculty of Engineering and Information Sciences. Katina is the IEEE Technology and Society Magazine editor-in-chief and IEEE Consumer Electronics Magazine senior editor. Since 2008 she has been a board member of the Australian Privacy Foundation, and until recently was the Vice-Chair. Michael researches the socio-ethical implications of emerging technologies, with an emphasis on an all-hazards approach to national security. She has written and edited six books and guest edited numerous special issues of journals on themes related to radio-frequency identification (RFID) tags, supply chain management, location-based services, innovation and surveillance/uberveillance for Proceedings of the IEEE, Computer and IEEE Potentials. Prior to academia, Katina worked for Nortel Networks as a senior network engineer in Asia, and also in information systems for OTIS and Andersen Consulting. She holds cross-disciplinary qualifications in technology and law.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Failure analysis has been, throughout the years, a fundamental tool in the aerospace sector, supporting assessments performed by sustainment and design engineers mainly related to failure modes and material suitability. The predicted service life of aircraft often exceeds 40 years, and the design assured life rarely accounts for all the in-service loads and in-service environmental menaces that aging aircraft must deal with throughout their service lives. From the most conservative safe-life conceptual design approaches to the most recent on-condition based design approaches, assessing the condition and predicting the failure modes of components and materials are essential for the development of adequate preventive and corrective maintenance actions, as well as for the accomplishment and optimization of scheduled aircraft maintenance programs. Moreover, as the operational conditions of aircraft may vary significantly from operator to operator (especially in military aircraft), it is necessary to assess whether the defined maintenance programs are adequate to guarantee the continuous reliability and safe usage of the aircraft, preventing catastrophic failures that bear significant maintenance and repair costs and may lead to the loss of human lives. Thus, failure analysis and material investigations performed as part of aircraft accident and incident investigations arise as powerful tools of the utmost importance for safety assurance and cost reduction within the aeronautical and aerospace sectors.
The Portuguese Air Force (PRTAF) has operated different aircraft throughout its long existence and, in some cases, has operated a particular type of aircraft for more than 30 years, gathering a great amount of expertise in assessing failure modes of aircraft materials; conducting aircraft accident and incident investigations (sometimes with the participation of the aircraft manufacturers and/or other operators); and developing design and repair solutions for in-service related problems. This paper addresses several studies to support the thesis that failure analysis plays a key role in flight safety improvement within the PRTAF. It presents a short summary of developed
Abstract:
This thesis analyses the practice of self-filming in two films: La Pudeur ou l'impudeur (Hervé Guibert) and Tarnation (Jonathan Caouette). We group these features under the label "pathographic self-filming". The sick person, emancipating himself from medical imagery and institutional cinematic practices, reclaims the filmic image for himself, aided by an ever more ergonomic technology. This resurgence of the image of the sick body in the social field does not occur without friction; the exposure of emaciated, agonising bodies summons a catastrophist imaginary and contradicts the rituals of bodily effacement practised by Western society. The form taken by the narrative of the self in pathographic self-filming depends on the illness affecting each creator. We observe a redefinition of sincerity in connection with the autobiographical exercise: pathographic self-filming uses certain fictional devices to create a discourse about oneself whose veracity rests on criteria other than those commonly accepted. Pathographic self-filming thus presupposes a genuine change of attitude and the establishment of techniques of the self. It induces a form of reconciliation with one's own physical and psychic identity. In this sense, filmic writing of the self is a transformative agent of life and a spiritual exercise. The filmmakers are not, however, turned solely toward themselves. Each includes a privileged few at the heart of his approach. Care of the self, in pathographic self-filming, is not dissociated from care for others. Self-filming and the subjective camera maintain a dialectical link that gives pathographic self-filming its meaning and dissolves their antagonism. The individual filming himself is not alone; his approach is not mere solipsism.
It is exceeded by the emergence of the other in the frame or even, at times, by the other's taking hold of the camera.
Abstract:
Effective hygiene and sanitation inspection of meat and meat products is essential for their production and commercialization. For this reason, the national and international standards responsible for the quality control of these products employ microbiological analysis methods as quality control tools. In December 2012, a Microbiological Scope for food and water was added to the Ministério da Agricultura Pecuária e Abastecimento (MAPA) website, proposing the replacement of some methods of Normative Instruction 62. Some of these methodologies are considered rapid, practical and convenient. However, other methodologies were still replaced by conventional ones, which present disadvantages such as incorrect interpretation of the phenotypical and biochemical characteristics of microorganisms, leading to misinterpretation of test results. Therefore, the objective of this study is to develop a comprehensive, practical and illustrative guidebook of microbiological analysis for in natura poultry cuts. The methods addressed in this guide are the official standard analyses required by the poultry cuts legislation: the Escherichia coli count, the thermotolerant coliform count, the aerobic plate count and the detection of Salmonella spp. The methodologies covered for these analyses are AOAC 998.08, Normative Instruction 62, ISO 4833-1:2013 and ISO 6579:2002, respectively. It is expected that the resulting guidebook, evaluated and approved by laboratory technicians, will help reduce analytical subjectivity, leading to more reliable interpretation of test results.
Abstract:
Multiple myeloma (MM) is a plasmocytic malignant proliferation of a single clone resulting in an overabundance of monoclonal immunoglobulins. MM commonly presents with bone disorders, renal failure, anaemia and hypercalcaemia. Hyperviscosity syndrome is rare, as are vaso-occlusive symptoms. The authors report a dramatic case of an 80-year-old woman admitted to the emergency department with full-blown distal gangrene. The culprit turned out to be a MM, unusually presenting with symptomatic hyperviscosity and peripheral occlusive ischaemia. This catastrophic and particularly dramatic presentation is almost unprecedented, with only a few cases reported worldwide.
Abstract:
A key driver of Australian sweetpotato productivity improvements and consumer demand has been industry adoption of disease-free planting material systems. On a farm isolated from the main Australian sweetpotato areas, virus-free germplasm is annually multiplied, with the subsequent 'pathogen-tested' (PT) sweetpotato roots shipped to commercial Australian sweetpotato growers. They in turn plant their PT roots into specially designated plant beds, commencing in late winter. From these beds, they cut sprouts as the basis for their commercial fields. Along with other intense agronomic practices, this system enables Australian producers to achieve the world's highest commercial yields (per hectare) of premium sweetpotatoes. Their industry organisation, ASPG (Australian Sweetpotato Growers Inc.), has identified productivity of mother plant beds as a key driver of crop performance. Growers and scientists are currently collaborating to investigate issues such as catastrophic plant bed losses; optimisation of irrigation and nutrient addition; rapidity and uniformity of initial plant bed harvests; optimal plant bed harvest techniques; virus re-infection of plant beds; and practical longevity of plant beds. A survey of 50 sweetpotato growers in Queensland and New South Wales identified substantial diversity in current plant bed systems, apparently influenced by growing district, scale of operation, time of planting, and machinery/labour availability. Growers identified key areas for plant bed research as: optimising the size and grading specifications of PT roots supplied for the plant beds; change in sprout density, vigour and performance through sequential cuttings of the plant bed; optimal height above ground level to cut sprouts to maximise commercial crop and plant bed performance; and use of structures and soil amendments in plant bed systems.
Our ongoing multi-disciplinary research program integrates detailed agronomic experiments, grower adaptive learning sites, product quality and consumer research, to enhance industry capacity for inspired innovation and commercial, sustainable practice change.
Abstract:
When components of a propulsion system are exposed to elevated flow temperatures, there is a risk of catastrophic failure if the components are not properly protected from the thermal loads. Among several strategies, slot film cooling is one of the most commonly used, yet poorly understood, active cooling techniques. Tangential injection of a relatively cool fluid layer protects the surface(s) in question, but the turbulent mixing between the hot mainstream and cooler film, along with the presence of the wall, presents an inherently complex problem where kinematics, thermal transport and multimodal heat transfer are coupled. Furthermore, new propulsion designs rely heavily on CFD analysis to verify their viability. These CFD models require validation of their results, and the current literature does not provide a data set for film cooling that meets all the demands for proper validation, namely a comprehensive set of kinematic, thermal and boundary condition data obtained over a wide range of conditions. This body of work aims at solving the fundamental issue of validation by providing high-quality comprehensive film cooling data (kinematics, thermal mixing, heat transfer). Three distinct velocity ratios (VR = uc/u∞) are examined, corresponding to wall-wake (VR~0.5), min-shear (VR~1.0), and wall-jet (VR~2.0) type flows at injection, while the temperature ratio TR = T∞/Tc is approximately 1.5 for all cases. Turbulence intensities at injection are 2-4% for the mainstream (urms/u∞, vrms/u∞) and on the order of 8-10% for the coolant (urms/uc, vrms/uc). A special emphasis is placed on inlet characterization, since inlet data in the literature is often incomplete or of relatively low quality for CFD development. The data reveals that min-shear injection provides the best performance, followed by the wall-jet. The wall-wake case is comparably poor in performance.
The comprehensive data suggests that this relative performance is due to the mixing strength of each case, as well as the location of regions of strong mixing with respect to the wall. Kinematic and thermal data show that strong mixing occurs in the wall-jet away from the wall (y/s>1), while strong mixing in the wall-wake occurs much closer to the wall (y/s<1). Min-shear cases exhibit noticeably weaker mixing confined to about y/s=1. In addition to these general observations, the experimental data obtained in this work is analyzed to reveal scaling laws for the inlets and near-wall scaling, to detect and characterize coherent structures in the flow, and to provide data reduction strategies for comparison to CFD models (RANS and LES).
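The velocity- and temperature-ratio definitions used above (VR = uc/u∞, TR = T∞/Tc) can be sketched as follows. The numeric speeds and temperatures, and the regime thresholds, are illustrative assumptions chosen only to reproduce the three regimes named in the abstract, not values from the study:

```python
# Sketch of the film-cooling ratios defined in the abstract.
def velocity_ratio(u_coolant: float, u_mainstream: float) -> float:
    """VR = uc / u_inf."""
    return u_coolant / u_mainstream

def temperature_ratio(T_mainstream: float, T_coolant: float) -> float:
    """TR = T_inf / Tc."""
    return T_mainstream / T_coolant

def regime(vr: float) -> str:
    # Thresholds are assumptions for illustration; the abstract only
    # names the regimes at VR ~ 0.5, ~1.0 and ~2.0.
    if vr < 0.75:
        return "wall-wake"
    if vr < 1.5:
        return "min-shear"
    return "wall-jet"

for u_c in (10.0, 20.0, 40.0):              # hypothetical coolant speeds, m/s
    vr = velocity_ratio(u_c, 20.0)          # hypothetical mainstream: 20 m/s
    print(f"VR={vr:.1f} -> {regime(vr)}")
# prints: VR=0.5 -> wall-wake, VR=1.0 -> min-shear, VR=2.0 -> wall-jet

tr = temperature_ratio(450.0, 300.0)        # hypothetical temperatures, K
print(f"TR={tr:.2f}")                       # 1.50, matching the TR ~ 1.5 of the study
```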