919 results for slow freezing
Abstract:
In 2006, Sir Edmund Hillary lambasted the modern climbing fraternity for abandoning other climbers to a slow, frozen death on Everest, claiming that in his day they would never leave someone to die. This followed the controversial death of David Sharp, passed by an estimated 40 climbers who were more interested in the summit than in the life of a fellow human being. But was this stinging criticism true, or just the faded recollections of a former climbing giant? This book investigates that claim through a narrative analysis, combining the empirical analysis of Hawley and Salisbury’s Himalayan Expedition Database with the anecdotal evidence provided by a plethora of newspaper articles and books. While there is evidence supporting the claim that commercialization is to blame for the breakdown of pro-social behaviour, the results cannot establish whether it is the commercial climber or the operator driving the problem, though they do suggest that the Sherpa are the saving grace.
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technologies. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems are now largely computerised, we are still undertaking commissioning testing using the same philosophy as if each signal were hard wired. This is slow and tedious and doesn’t do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it “abstracts” the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks and made available at the design, configuration and operational stages of the system. The primary advantage of accessing data using this methodology, rather than the traditional position-based method (such as DNP 3.0), is that generic tools can be created to manipulate data. Self-describing data contains the information that these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and methods for dealing with this data within a tightly bound and compliant IEC 61850 Substation Automation System could revolutionise commissioning testing when compared with traditional point-to-point methods. Using the outcomes of an undergraduate thesis project, we can demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file, for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications have been made at any stage of the commissioning process. This tool has been implemented in a Java programming environment using an open source IEC 61850 library to facilitate the client-server association with the relay.
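As a hedged illustration of the comparison step described in this abstract (not the actual thesis tool), the sketch below reads the DataSet definitions from a relay’s CID/SCL file with the standard Java XML APIs and checks them against the data sets reported by the device. The map passed in as relayDataSets is assumed to have been retrieved over a client-server association with an IEC 61850 client library; that retrieval, and the exact reference format, are illustrative assumptions.

```java
import java.nio.file.Path;
import java.util.*;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;

/** Compares the DataSet definitions in a relay's CID (SCL) file against
 *  the data sets actually configured on the device. Illustrative sketch only. */
public class DataSetChecker {

    /** Reads every <DataSet> in the CID file and returns, per data set name,
     *  the set of FCDA member references it declares. */
    static Map<String, Set<String>> readSclDataSets(Path cidFile) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(cidFile.toFile());
        Map<String, Set<String>> dataSets = new TreeMap<>();
        NodeList nodes = doc.getElementsByTagName("DataSet");
        for (int i = 0; i < nodes.getLength(); i++) {
            Element ds = (Element) nodes.item(i);
            Set<String> members = new TreeSet<>();
            NodeList fcdas = ds.getElementsByTagName("FCDA");
            for (int j = 0; j < fcdas.getLength(); j++) {
                Element f = (Element) fcdas.item(j);
                // Build a comparable reference from the SCL attributes,
                // e.g. "LD0/MMXU1.MX.TotW" (format is an assumption).
                members.add(f.getAttribute("ldInst") + "/" + f.getAttribute("prefix")
                        + f.getAttribute("lnClass") + f.getAttribute("lnInst")
                        + "." + f.getAttribute("fc") + "." + f.getAttribute("doName"));
            }
            dataSets.put(ds.getAttribute("name"), members);
        }
        return dataSets;
    }

    /** Reports data sets that are missing, unexpected, or whose members differ.
     *  'relayDataSets' would be obtained online via an IEC 61850 client association. */
    static boolean compare(Map<String, Set<String>> sclDataSets,
                           Map<String, Set<String>> relayDataSets) {
        boolean ok = true;
        for (String name : sclDataSets.keySet()) {
            if (!relayDataSets.containsKey(name)) {
                System.out.println("Missing on relay: " + name);
                ok = false;
            } else if (!relayDataSets.get(name).equals(sclDataSets.get(name))) {
                System.out.println("Members differ for: " + name);
                ok = false;
            }
        }
        for (String name : relayDataSets.keySet()) {
            if (!sclDataSets.containsKey(name)) {
                System.out.println("Unexpected on relay: " + name);
                ok = false;
            }
        }
        return ok;
    }
}
```

In a commissioning workflow, the same comparison could be repeated per relay and per vendor, flagging any data set that is missing, unexpected, or whose FCDA members differ from the engineered configuration.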
Abstract:
It’s the stuff of nightmares: your intimate images are leaked and posted online by somebody you thought you could trust. But in Australia, victims often have no real legal remedy for this kind of abuse. This is the key problem of regulating the internet. Often, speech we might consider abusive or offensive isn’t actually illegal. And even when the law technically prohibits something, enforcing it directly against offenders can be difficult. It is a slow and expensive process, and where the offender or the content is overseas, there is virtually nothing victims can do. Ultimately, punishing intermediaries for content posted by third parties isn’t helpful. But we do need to have a meaningful conversation about how we want our shared online spaces to feel. The providers of these spaces have a moral, if not legal, obligation to facilitate this conversation.
Abstract:
This paper asks to what scale and at what speed society needs to reduce its ecological footprint and improve resource productivity to prevent further overshoot and return within the limits of the earth’s ecological life support systems. How fast do these changes need to be achieved? The paper shows that a large range of studies now find that engineering sustainable solutions will need to deliver roughly an order-of-magnitude improvement in resource productivity (sometimes called a Factor of 10, or a 90% reduction) by 2050 to achieve real and lasting ecological sustainability. This marks a significant challenge for engineers – indeed for all designers and architects – where best practice in engineering sustainable solutions will need to achieve large resource productivity targets. The paper brings together examples of best practice in achieving these large targets from around the world. The paper also highlights key resources and texts for engineers who wish to learn how to do it. But engineers need to be realistic and patient. Significant barriers exist to achieving Factor 4-10, such as the fact that infrastructure and technology rollover and replacement is often slow. This slow rollover of the built environment and technology is the context within which most engineers work, making the goal of achieving Factor 10 all the more challenging. However, the paper demonstrates that by using best practice in engineering sustainable solutions and by addressing the necessary market, information and institutional failures, it is possible to achieve Factor 10 over the next 50 years. This paper draws on recent publications by The Natural Edge Project (TNEP) and partners, including Hargroves, K. and Smith, M. (eds) (2005) The Natural Advantage of Nations: Business Opportunities, Innovation and Governance for the 21st Century, and the TNEP Engineering Sustainable Solutions Program - Critical Literacies for Engineers Portfolio. Both projects have the significant support of Engineers Australia, its College of Environmental Engineers and the Society of Sustainability and Environmental Engineering.
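For readers unfamiliar with the Factor X terminology, the equivalence quoted above between a Factor of 10 and a 90% reduction follows directly from the definition of resource productivity; this is a standard relation rather than a result of the paper:

```latex
% If resource productivity improves by a factor F, resource use per unit of
% service delivered falls to 1/F, a fractional reduction of 1 - 1/F.
\[
  \text{reduction} \;=\; 1 - \frac{1}{F},
  \qquad F = 4 \;\Rightarrow\; 75\%\ \text{reduction},
  \qquad F = 10 \;\Rightarrow\; 90\%\ \text{reduction}.
\]
```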
Abstract:
On 19 November 2014, seven Harvard students — the Harvard Climate Justice Coalition — brought a legal action against Harvard University to compel it to withdraw its investments from fossil fuel companies. The plaintiffs include the Harvard Climate Justice Coalition; Alice Cherry, a law student; Benjamin Franta, a physics student interested in renewable energy; Sidni Frederick, a student of history and literature; Joseph Hamilton, a law student; Olivia Kivel, a biologist interested in sustainable farming; Talia Rothstein, a student of history and literature; and Kelsey Skaggs, a law student from Alaska interested in climate justice. The Harvard Climate Justice Coalition is also bringing the lawsuit as ‘next friend of Plaintiffs Future Generations, individuals not yet born or too young to assert their rights but whose future health, safety, and welfare depends on current efforts to slow the pace of climate change.’ The case of Harvard Climate Justice Coalition v. President and Fellows of Harvard College is being heard in the Suffolk County Superior Court of Massachusetts. The dispute will set an important precedent for the ongoing policy and legal battles in respect of climate change, education, and fossil fuel divestment.
Abstract:
Effectively capturing opportunities requires rapid decision-making. We investigate the speed of opportunity evaluation decisions by focusing on firms’ venture termination and venture advancement decisions. Experience, standard operating procedures, and confidence allow firms to make opportunity evaluation decisions faster; we propose that a firm’s attentional orientation, as reflected in its project portfolio, limits the number of domains in which these speed-enhancing mechanisms can be developed. Hence firms’ decision speed is likely to vary between different types of decisions. Using unique data on 3,269 mineral exploration ventures in the Australian mining industry, we find that firms with a higher degree of attention toward earlier-stage exploration activities are quicker to abandon potential opportunities in early development but slower to do so later, and that such firms are also slower to advance on potential opportunities at all stages compared to firms that focus their attention differently. Market dynamism moderates these relationships, but only with regard to initial evaluation decisions. Our study extends research on decision speed by showing that firms are not necessarily fast or slow regarding all the decisions they make, and by offering an opportunity evaluation framework that recognizes that decision makers can, and in fact often do, pursue multiple potential opportunities simultaneously.
Abstract:
Spoken word production is assumed to involve stages of processing in which activation spreads through layers of units comprising lexical-conceptual knowledge and their corresponding phonological word forms. Using high-field (4T) functional magnetic resonance imaging (fMRI), we assessed whether the relationship between these stages is strictly serial or involves cascaded-interactive processing, and whether central (decision/control) processing mechanisms are involved in lexical selection. Participants performed the competitor priming paradigm in which distractor words, named from a definition and semantically related to a subsequently presented target picture, slow picture-naming latency compared to that with unrelated words. The paradigm intersperses two trials between the definition and the picture to be named, temporally separating activation in the word perception and production networks. Priming semantic competitors of target picture names significantly increased activation in the left posterior temporal cortex, and to a lesser extent the left middle temporal cortex, consistent with the predictions of cascaded-interactive models of lexical access. In addition, extensive activation was detected in the anterior cingulate and pars orbitalis of the inferior frontal gyrus. The findings indicate that lexical selection during competitor priming is biased by top-down mechanisms to reverse associations between primed distractor words and target pictures to select words that meet the current goal of speech.
Abstract:
This paper describes algorithms that can identify patterns of brain structure and function associated with Alzheimer's disease, schizophrenia, normal aging, and abnormal brain development based on imaging data collected in large human populations. Extraordinary information can be discovered with these techniques: dynamic brain maps reveal how the brain grows in childhood, how it changes in disease, and how it responds to medication. Genetic brain maps can reveal genetic influences on brain structure, shedding light on the nature-nurture debate, and the mechanisms underlying inherited neurobehavioral disorders. Recently, we created time-lapse movies of brain structure for a variety of diseases. These identify complex, shifting patterns of brain structural deficits, revealing where, and at what rate, the path of brain deterioration in illness deviates from normal. Statistical criteria can then identify situations in which these changes are abnormally accelerated, or when medication or other interventions slow them. In this paper, we focus on describing our approaches to map structural changes in the cortex. These methods have already been used to reveal the profile of brain anomalies in studies of dementia, epilepsy, depression, childhood- and adult-onset schizophrenia, bipolar disorder, attention-deficit/hyperactivity disorder, fetal alcohol syndrome, Tourette syndrome, Williams syndrome, and in methamphetamine abusers. Specifically, we describe an image analysis pipeline known as cortical pattern matching that helps compare and pool cortical data over time and across subjects. Statistics are then defined to identify brain structural differences between groups, including localized alterations in cortical thickness, gray matter density (GMD), and asymmetries in cortical organization. Subtle features, not seen in individual brain scans, often emerge when population-based brain data are averaged in this way. Illustrative examples are presented to show the profound effects of development and various diseases on the human cortex. Dynamically spreading waves of gray matter loss are tracked in dementia and schizophrenia, and these sequences are related to normally occurring changes in healthy subjects of various ages.
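As a rough, hedged sketch of the group statistics referred to above (not the cortical pattern matching pipeline itself, which additionally involves elastic surface registration and multiple-comparison correction), the following computes a two-sample t-statistic at every cortical surface vertex from per-subject thickness maps assumed to be already aligned to a common surface; the array layout and names are illustrative assumptions.

```java
/** Vertex-wise two-sample t-statistics for cortical thickness maps.
 *  groupA and groupB are [subjects][vertices] arrays, assumed to be
 *  sampled on the same registered surface mesh. Illustrative sketch only. */
public class CorticalGroupStats {

    static double[] vertexTStatistics(double[][] groupA, double[][] groupB) {
        int nVertices = groupA[0].length;
        int nA = groupA.length, nB = groupB.length;
        double[] t = new double[nVertices];
        for (int v = 0; v < nVertices; v++) {
            double meanA = columnMean(groupA, v), meanB = columnMean(groupB, v);
            double varA = columnVariance(groupA, v, meanA);
            double varB = columnVariance(groupB, v, meanB);
            // Pooled-variance (Student) two-sample t-statistic at this vertex.
            double pooled = ((nA - 1) * varA + (nB - 1) * varB) / (nA + nB - 2);
            t[v] = (meanA - meanB) / Math.sqrt(pooled * (1.0 / nA + 1.0 / nB));
        }
        return t;
    }

    static double columnMean(double[][] data, int v) {
        double sum = 0;
        for (double[] subject : data) sum += subject[v];
        return sum / data.length;
    }

    static double columnVariance(double[][] data, int v, double mean) {
        double sum = 0;
        for (double[] subject : data) sum += (subject[v] - mean) * (subject[v] - mean);
        return sum / (data.length - 1);  // unbiased sample variance
    }
}
```

The resulting statistical map would typically be thresholded, for example by permutation testing, before being interpreted as a pattern of structural deficit.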
Abstract:
Aerobic respiration is a fundamental energy-generating process; however, there is a cost associated with living in an oxygen-rich environment, because partially reduced oxygen species can damage cellular components. Organisms evolved enzymes that alleviate this damage and protect the intracellular milieu, most notably thiol peroxidases, which are abundant and conserved enzymes that mediate hydrogen peroxide signaling and act as the first line of defense against oxidants in nearly all living organisms. Deletion of all eight thiol peroxidase genes in yeast (∆8 strain) is not lethal, but results in slow growth and a high mutation rate. Here we characterized mechanisms that allow yeast cells to survive under conditions of thiol peroxidase deficiency. Two independent ∆8 strains increased mitochondrial content, altered mitochondrial distribution, and became dependent on respiration for growth, but they were not hypersensitive to H2O2. In addition, both strains independently acquired a second copy of chromosome XI and increased expression of genes encoded by it. Survival of ∆8 cells was dependent on mitochondrial cytochrome-c peroxidase (CCP1) and UTH1, present on chromosome XI. Coexpression of these genes in ∆8 cells led to the elimination of the extra copy of chromosome XI and improved cell growth, whereas deletion of either gene was lethal. Thus, thiol peroxidase deficiency requires dosage compensation of CCP1 and UTH1 via chromosome XI aneuploidy, wherein these proteins support hydroperoxide removal with the reducing equivalents generated by the electron transport chain. To our knowledge, this is the first evidence of adaptive aneuploidy counteracting oxidative stress.
Abstract:
DURBAN CLIMATE CHANGE CONFERENCE: In a global day of action for climate justice, thousands of protestors complained about the slow progress in international debates on climate change at the United Nations conference in Durban. One of the chants of the campaigners was “Climate justice … not climate apartheid”. Banners dubbed the Durban event a “circus” – a “conference of polluters”.
Abstract:
Background Nurses and midwives must be able to adapt their behaviour and language to meet the health care needs of patients and their families in diverse and at times difficult circumstances. Methods This study of fourth-year dual-degree nurse midwives uses Communication Accommodation Theory strategies to examine their use of language and discourse when managing a sequential simulation of neonatal resuscitation and bereavement support. Results The results showed that many of the students were slow to respond to the changing needs of the patient and family and at times used ineffectual and disengaging language. Conclusion Clinical simulation is a safe and effective method for nurses and midwives to experience and practise the use of language and discourse in challenging circumstances.
Abstract:
Background Despite the critical role of immunoglobulin E (IgE) in allergy, circulating IgE+ B cells are scarce. Here, we describe, in patients with allergic rhinitis, B cells with a memory phenotype that respond to a prototypic aeroallergen. Methods Fifteen allergic rhinitis patients with grass pollen allergy and 13 control subjects were examined. Blood mononuclear cells stained with carboxyfluorescein diacetate succinimidyl ester (CFSE) were cultured with Bahia grass pollen. Proliferation and phenotype were assessed by multicolour flow cytometry. Results In blood of allergic rhinitis patients with high serum IgE to grass pollen, most IgEhi cells were CD123+ HLA-DR- basophils, with IgE for the major pollen allergen (Pas n 1). Both B and T cells from pollen-allergic donors showed higher proliferation to grass pollen than those from nonallergic donors (P = 0.002 and 0.010, respectively), whereas responses to vaccine antigens and mitogen did not differ between groups. Allergen-driven B cells that divided rapidly (CD19mid CD3- CFSElo) showed higher CD27 (P = 0.008) and lower CD19 (P = 0.004) and CD20 (P = 0.004) expression than B cells that were slow to respond to allergen (CD19hi CD3- CFSEmid). Moreover, rapidly dividing allergen-driven B cells (CD19mid CFSElo CD27hi) showed higher expression of the plasmablast marker CD38 compared with B cells (CD19hi CFSEmid CD27lo) that were slow to divide. Conclusion Patients with pollen allergy, but not control donors, have a population of circulating allergen-specific B cells with the phenotype and functional properties of adaptive memory B-cell responses. These cells could provide precursors for allergen-specific IgE production upon allergen re-exposure.
Abstract:
Cost estimating has been acknowledged as a crucial component of construction projects. Depending on available information and project requirements, cost estimates evolve in tandem with project lifecycle stages: conceptualisation, design development, execution and facility management. The accuracy of cost estimates is crucial to producing project tenders and, eventually, to budget management. Notwithstanding the initial slow pace of its adoption, Building Information Modelling (BIM) has successfully addressed a number of challenges previously characteristic of traditional approaches in the AEC industry, including poor communication, the prevalence of islands of information and frequent rework. Therefore, it is conceivable that BIM can be leveraged to address specific shortcomings of cost estimation. The impetus for leveraging BIM models for accurate cost estimation is to align budgeted and actual cost. This paper hypothesises that the accuracy of BIM-based estimation, as a more efficient, process-mirroring counterpart of traditional cost estimation methods, can be enhanced by simulating the factor variables of traditional cost estimation. Through literature reviews and preliminary expert interviews, this paper explores the factors that could potentially lead to more accurate cost estimates for construction projects. The findings show numerous factors that affect cost estimates, ranging from project information and its characteristics to the project team, clients, contractual matters, and other external influences. This paper will make a particular contribution to the early phase of BIM-based project estimation.
Abstract:
Cost estimating is a key task within Quantity Surveyors’ (QS) offices. Provision of an accurate estimate is vital to ensure that the objectives of the client are met by staying within the client’s budget. Building Information Modelling (BIM) is an evolving technology that has gained attention in the construction industries all over the world. Benefits from the use of BIM include cost and time savings if the processes used by the procurement team are adapted to maximise the benefits of BIM. BIM can be used by QSs to automate aspects of quantity take-off and the preparation of estimates, decreasing turnaround time and assisting in controlling errors and inaccuracies. The Malaysian government has decided to require the use of BIM for its projects beginning in 2016. However, slow uptake is reported in the use of BIM, both within companies and to support collaboration, within the Malaysian industry. It has been recommended that QSs start evaluating the impact of BIM on their practices. This paper reviews the perspectives of QSs in Malaysia towards the use of BIM to achieve more dependable results in their cost estimating practice. The objectives of this paper include identifying strategies for improving practice and potential adoption drivers that lead QSs to BIM usage in their construction projects. From the expert interviews, it was found that, despite still using traditional methods and not practising BIM, the interviewees have acquired limited knowledge related to BIM. There are some drivers that potentially motivate them to employ BIM in their practices. These include client demands, innovation in traditional methods, speed in estimating costs, reduced time and costs, improvement in practices and self-awareness, efficiency in projects, and competition from other companies. The findings of this paper identify the potential drivers encouraging Malaysian Quantity Surveyors to exploit BIM in their construction projects.
Abstract:
Introduction: Agonists of glucagon-like peptide-1 (GLP-1) receptors are used in the treatment of type 2 diabetes. Albiglutide is a new long-acting GLP-1 receptor agonist being developed for once-weekly use. Areas covered: This evaluation covers two clinical trials in the HARMONY clinical trial series. HARMONY 3 compares albiglutide to sitagliptin and glimepiride in subjects with type 2 diabetes poorly controlled with metformin, and HARMONY 6 compares albiglutide to insulin lispro in subjects poorly controlled with slow/medium release preparations of insulin. Expert opinion: Both studies showed that albiglutide lowered HbA1c and had advantages over its comparator drugs. However, questions remain about the safety of albiglutide. Albiglutide is not being used in subjects with a history of thyroid cancer, as it is not known whether thyroid cancer is a rare adverse effect of albiglutide. Also, the safety of albiglutide in subjects with type 2 diabetes and high cardiovascular risk is unknown.