300 results for SLOW


Relevance:

10.00%

Publisher:

Abstract:

Cloud computing has significantly impacted a broad range of industries, but these technologies and services have been absorbed throughout the marketplace unevenly. Some industries have moved aggressively towards cloud computing, while others have moved much more slowly. For the most part, the energy sector has approached cloud computing in a measured and cautious way, with progress often taking the form of private cloud solutions rather than public ones, or hybridized information technology systems that combine cloud and existing non-cloud architectures. By moving towards cloud computing so slowly and tentatively, however, the energy industry may deny itself the full benefits that a more complete migration to the public cloud has delivered in several other industries. This short communication accordingly offers a high-level overview of cloud computing and argues that the energy sector should make a more complete migration to the public cloud in order to unlock the major system-wide efficiencies that cloud computing can provide. It also argues that assets within the energy sector should be designed with as much modularity and flexibility as possible so that they are not locked out of cloud-friendly options in the future.

Relevance:

10.00%

Publisher:

Abstract:

In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. Predictions of scalar transport and mixing are often inferred and rarely accurate, owing to an inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions with near-zero freshwater discharge for about 48 hours. During the study, acoustic Doppler velocimeters (ADVs) sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and drifters tracked by global positioning system (GPS) provided lower-frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented with continuous measurements of water depth, conductivity, temperature and other physicochemical parameters. Thorough quality control was carried out by applying relevant error-removal filters to each data set to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance and ‘true’ turbulence in the flow field. The time series of mean flow measurements from both the ADCP and the drifters were consistent with the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to tidal inertial currents. The channel exhibited a mixed-type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly in slack waters. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy. For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9 and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The statistical analysis suggested that the ebb-phase turbulence field was dominated by eddies that evolved from ejection-type processes, while the flood-phase field contained mixed eddies with a significant proportion related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell within the range of −0.5 to +2. The TD technique described herein allowed the characterisation of a broader temporal scale of fluctuations in high-frequency data sampled over a few tidal cycles. The study characterises the ranges of fluctuation required for accurate modelling of shallow-water dispersion and mixing in a sub-tropical estuary.
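The triple decomposition described above can be pictured as a pair of low-pass filtering steps: fluctuations slower than a tidal cutoff are treated as tidal, the band between the tidal and a shorter cutoff as the slow (resonance-scale) fluctuations, and the residual as turbulence. The following is a minimal Python sketch of that idea; the cutoff periods are illustrative assumptions, not the study's actual values.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def triple_decompose(u, fs=50.0, tidal_cutoff_s=3600.0, slow_cutoff_s=60.0):
    """Split a velocity record u (m/s) into tidal, slow and turbulent parts.

    Illustrative cutoffs: periods longer than `tidal_cutoff_s` count as
    tidal, periods between the two cutoffs as slow (resonance-scale)
    fluctuations, and anything faster as 'true' turbulence.
    """
    nyq = fs / 2.0
    # Low-pass at the tidal cutoff -> tidal-scale component.
    sos_tidal = butter(4, (1.0 / tidal_cutoff_s) / nyq, btype="low", output="sos")
    tidal = sosfiltfilt(sos_tidal, u)
    # Low-pass at the shorter cutoff; removing the tidal part leaves
    # the slow fluctuations.
    sos_slow = butter(4, (1.0 / slow_cutoff_s) / nyq, btype="low", output="sos")
    slow = sosfiltfilt(sos_slow, u) - tidal
    # The remaining high-frequency residual is treated as turbulence.
    turbulence = u - tidal - slow
    return tidal, slow, turbulence

# Example on a synthetic 2-hour, 50 Hz record: a semidiurnal-tide-like
# oscillation plus white noise standing in for turbulence.
t = np.arange(0.0, 7200.0, 1.0 / 50.0)
u = 0.3 * np.sin(2 * np.pi * t / 44714.0) + 0.05 * np.random.randn(t.size)
tidal, slow, turb = triple_decompose(u)
```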

Relevance:

10.00%

Publisher:

Abstract:

In 2006, Sir Edmund Hillary lambasted the modern climbing fraternity for abandoning other climbers to a slow frozen death on Everest, claiming that in his day they would never have left someone to die. This followed the controversial death of David Sharp, passed by an estimated 40 climbers who were more interested in the summit than in the life of a fellow human being. But was this stinging criticism true, or just the faded recollections of a former climbing giant? This book investigates that claim through a narrative analysis, combining empirical analysis of Hawley and Salisbury's Himalayan Expedition Database with the anecdotal evidence provided by a plethora of newspaper articles and books. While there is evidence supporting the claim that commercialization is to blame for the breakdown of pro-social behaviour, the results cannot determine whether it is the commercial climber or the operator driving the problem; they do, however, suggest that the Sherpa are the saving grace.

Relevance:

10.00%

Publisher:

Abstract:

Substation Automation Systems have undergone many transformational changes triggered by improvements in technology. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems are largely computerised, we still undertake commissioning testing using the same philosophy, as if each signal were hard-wired. This is slow and tedious and doesn’t do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it “abstracts” the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks and made available at the design, configuration and operational stages of the system. The primary advantage of accessing data using this methodology, rather than the traditional position-based method (such as DNP 3.0), is that generic tools can be created to manipulate data. Self-describing data contains the information that these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and methods for dealing with data within a tightly bound and compliant IEC 61850 Substation Automation System could remove much of the need to test systems by traditional point-to-point methods. Using the outcomes of an undergraduate thesis project, we demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file, for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. The tool has been implemented in a Java programming environment using an open-source IEC 61850 library to facilitate the server-client association with the relay.
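The thesis tool itself was written in Java, but the core check is straightforward to sketch. Below is a minimal Python illustration of the idea, assuming only that the CID file is standard SCL XML; the relay-side data sets, which the real tool retrieves over the server-client association, are supplied by hand here.

```python
import xml.etree.ElementTree as ET

SCL_NS = {"scl": "http://www.iec.ch/61850/2003/SCL"}

def datasets_from_cid(cid_path):
    """Parse DataSet definitions from an SCL/CID file.

    Returns {dataset name: set of member references}, each member rendered
    as 'ldInst/prefix+lnClass+lnInst.doName[.daName]'.
    """
    tree = ET.parse(cid_path)
    datasets = {}
    for ds in tree.iter("{http://www.iec.ch/61850/2003/SCL}DataSet"):
        members = set()
        for fcda in ds.findall("scl:FCDA", SCL_NS):
            ref = "{}/{}{}{}.{}".format(
                fcda.get("ldInst", ""), fcda.get("prefix", ""),
                fcda.get("lnClass", ""), fcda.get("lnInst", ""),
                fcda.get("doName", ""))
            if fcda.get("daName"):
                ref += "." + fcda.get("daName")
            members.add(ref)
        datasets[ds.get("name")] = members
    return datasets

def compare(cid_datasets, relay_datasets):
    """Report data sets that differ between the CID file and the relay."""
    for name in sorted(set(cid_datasets) | set(relay_datasets)):
        expected = cid_datasets.get(name, set())
        actual = relay_datasets.get(name, set())
        if expected == actual:
            print(f"OK: {name}")
            continue
        print(f"MISMATCH in {name}:")
        for ref in sorted(expected - actual):
            print(f"  missing on relay: {ref}")
        for ref in sorted(actual - expected):
            print(f"  unexpected on relay: {ref}")

# In the real tool, relay_datasets is read from the device over the
# IEC 61850 client association; here it would be passed in directly:
# compare(datasets_from_cid("relay.cid"), relay_datasets)
```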

Relevance:

10.00%

Publisher:

Abstract:

It’s the stuff of nightmares: your intimate images are leaked and posted online by somebody you thought you could trust. But in Australia, victims often have no real legal remedy for this kind of abuse. This is the key problem of regulating the internet. Often, speech we might consider abusive or offensive isn’t actually illegal. And even when the law technically prohibits something, enforcing it directly against offenders can be difficult. It is a slow and expensive process, and where the offender or the content is overseas, there is virtually nothing victims can do. Ultimately, punishing intermediaries for content posted by third parties isn’t helpful. But we do need to have a meaningful conversation about how we want our shared online spaces to feel. The providers of these spaces have a moral, if not legal, obligation to facilitate this conversation.

Relevance:

10.00%

Publisher:

Abstract:

This paper asks to what scale and at what speed society needs to reduce its ecological footprint and improve resource productivity in order to prevent further overshoot and return within the limits of the earth’s ecological life-support systems. How fast do these changes need to be achieved? The paper shows that a large range of studies now find that engineering sustainable solutions will need to deliver roughly an order-of-magnitude improvement in resource productivity (sometimes called Factor 10, or a 90% reduction in resource use) by 2050 to achieve real and lasting ecological sustainability. This marks a significant challenge for engineers – indeed for all designers and architects – since best practice in engineering sustainable solutions will need to achieve large resource productivity targets. The paper brings together examples of best practice in achieving these large targets from around the world, and highlights key resources and texts for engineers who wish to learn how to do it. But engineers need to be realistic and patient. Significant barriers exist to achieving Factor 4-10, such as the fact that infrastructure and technology rollover and replacement is often slow. This slow rollover of the built environment and technology is the context within which most engineers work, making the goal of achieving Factor 10 all the more challenging. However, the paper demonstrates that by using best practice in engineering sustainable solutions, and by addressing the necessary market, information and institutional failures, it is possible to achieve Factor 10 over the next 50 years. This paper draws on recent publications by The Natural Edge Project (TNEP) and partners, including Hargroves, K. and Smith, M. (eds) (2005) The Natural Advantage of Nations: Business Opportunities, Innovation and Governance for the 21st Century, and the TNEP Engineering Sustainable Solutions Program – Critical Literacies for Engineers Portfolio. Both projects have the significant support of Engineers Australia, its College of Environmental Engineers and the Society of Sustainability and Environmental Engineering.
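For clarity, the equivalence between "Factor 10" and a "90% reduction" is simple arithmetic: holding the service delivered constant, a tenfold improvement in resource productivity means resource use falls to one tenth of its original level.

```latex
% Resource productivity P = service delivered S per unit of resource R.
% Holding S constant, multiplying P by 10 divides R by 10:
\[
  P = \frac{S}{R}, \qquad
  P' = 10P \;\Rightarrow\; R' = \frac{S}{P'} = \frac{R}{10},
\]
\[
  \text{reduction in resource use} = 1 - \frac{R'}{R} = 1 - \frac{1}{10} = 90\%.
\]
```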

Relevance:

10.00%

Publisher:

Abstract:

On 19 November 2014, seven Harvard students — the Harvard Climate Justice Coalition — brought a legal action against Harvard University to compel it to withdraw its investments from fossil fuel companies. The plaintiffs include the Harvard Climate Justice Coalition; Alice Cherry, a law student; Benjamin Franta, a physics student interested in renewable energy; Sidni Frederick, a student of history and literature; Joseph Hamilton, a law student; Olivia Kivel, a biologist interested in sustainable farming; Talia Rothstein, a student of history and literature; and Kelsey Skaggs, a law student from Alaska interested in climate justice. The Harvard Climate Justice Coalition is also bringing the lawsuit as ‘next friend of Plaintiffs Future Generations, individuals not yet born or too young to assert their rights but whose future health, safety, and welfare depends on current efforts to slow the pace of climate change.’ The case, Harvard Climate Justice Coalition v. President and Fellows of Harvard College, is being heard in the Suffolk County Superior Court of Massachusetts. The dispute will be an important precedent in the ongoing policy and legal battles over climate change, education, and fossil fuel divestment.

Relevance:

10.00%

Publisher:

Abstract:

Effectively capturing opportunities requires rapid decision-making. We investigate the speed of opportunity evaluation decisions by focusing on firms' venture termination and venture advancement decisions. Experience, standard operating procedures, and confidence allow firms to make opportunity evaluation decisions faster; we propose that a firm's attentional orientation, as reflected in its project portfolio, limits the number of domains in which these speed-enhancing mechanisms can be developed. Hence firms' decision speed is likely to vary between different types of decisions. Using unique data on 3,269 mineral exploration ventures in the Australian mining industry, we find that firms with a higher degree of attention toward earlier-stage exploration activities are quicker to abandon potential opportunities in early development but slower to do so later, and that such firms are also slower to advance on potential opportunities at all stages compared with firms that focus their attention differently. Market dynamism moderates these relationships, but only with regard to initial evaluation decisions. Our study extends research on decision speed by showing that firms are not necessarily fast or slow regarding all the decisions they make, and by offering an opportunity evaluation framework that recognizes that decision makers can, and in fact often do, pursue multiple potential opportunities simultaneously.

Relevance:

10.00%

Publisher:

Abstract:

Spoken word production is assumed to involve stages of processing in which activation spreads through layers of units comprising lexical-conceptual knowledge and their corresponding phonological word forms. Using high-field (4T) functional magnetic resonance imaging (fMRI), we assessed whether the relationship between these stages is strictly serial or involves cascaded-interactive processing, and whether central (decision/control) processing mechanisms are involved in lexical selection. Participants performed the competitor priming paradigm, in which distractor words, named from a definition and semantically related to a subsequently presented target picture, slow picture-naming latency compared with unrelated words. The paradigm intersperses two trials between the definition and the picture to be named, temporally separating activation in the word perception and production networks. Priming semantic competitors of target picture names significantly increased activation in the left posterior temporal cortex, and to a lesser extent the left middle temporal cortex, consistent with the predictions of cascaded-interactive models of lexical access. In addition, extensive activation was detected in the anterior cingulate and pars orbitalis of the inferior frontal gyrus. The findings indicate that lexical selection during competitor priming is biased by top-down mechanisms, which reverse associations between primed distractor words and target pictures in order to select words that meet the current goal of speech.

Relevance:

10.00%

Publisher:

Abstract:

This paper describes algorithms that can identify patterns of brain structure and function associated with Alzheimer's disease, schizophrenia, normal aging, and abnormal brain development based on imaging data collected in large human populations. Extraordinary information can be discovered with these techniques: dynamic brain maps reveal how the brain grows in childhood, how it changes in disease, and how it responds to medication. Genetic brain maps can reveal genetic influences on brain structure, shedding light on the nature-nurture debate, and the mechanisms underlying inherited neurobehavioral disorders. Recently, we created time-lapse movies of brain structure for a variety of diseases. These identify complex, shifting patterns of brain structural deficits, revealing where, and at what rate, the path of brain deterioration in illness deviates from normal. Statistical criteria can then identify situations in which these changes are abnormally accelerated, or when medication or other interventions slow them. In this paper, we focus on describing our approaches to map structural changes in the cortex. These methods have already been used to reveal the profile of brain anomalies in studies of dementia, epilepsy, depression, childhood- and adult-onset schizophrenia, bipolar disorder, attention-deficit/hyperactivity disorder, fetal alcohol syndrome, Tourette syndrome, Williams syndrome, and in methamphetamine abusers. Specifically, we describe an image analysis pipeline known as cortical pattern matching that helps compare and pool cortical data over time and across subjects. Statistics are then defined to identify brain structural differences between groups, including localized alterations in cortical thickness, gray matter density (GMD), and asymmetries in cortical organization. Subtle features, not seen in individual brain scans, often emerge when population-based brain data are averaged in this way. Illustrative examples are presented to show the profound effects of development and various diseases on the human cortex. Dynamically spreading waves of gray matter loss are tracked in dementia and schizophrenia, and these sequences are related to normally occurring changes in healthy subjects of various ages.
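The group statistics underlying such maps can be illustrated compactly. The sketch below, in Python, shows a vertex-wise comparison of cortical thickness between two groups; it assumes the thickness values have already been mapped to a common surface (establishing that vertex correspondence across subjects is precisely the job of cortical pattern matching), and uses a simple Bonferroni threshold where permutation-based corrections would typically be preferred.

```python
import numpy as np
from scipy import stats

def vertexwise_group_difference(thickness_a, thickness_b, alpha=0.05):
    """Compare cortical thickness between two groups at every vertex.

    thickness_a, thickness_b: arrays of shape (n_subjects, n_vertices),
    assumed registered to a common cortical surface so that column v is
    the same anatomical location in every subject. Returns the t-map and
    a Bonferroni-thresholded significance mask.
    """
    t_map, p_map = stats.ttest_ind(thickness_a, thickness_b, axis=0)
    # Bonferroni is the bluntest multiple-comparisons correction; it
    # stands in here for the permutation tests used in practice.
    mask = p_map < (alpha / thickness_a.shape[1])
    return t_map, mask

# Example with synthetic data: 20 patients vs 20 controls, 5,000 vertices,
# with simulated localized thinning over the first 500 vertices.
rng = np.random.default_rng(0)
controls = rng.normal(3.0, 0.3, size=(20, 5000))
patients = rng.normal(3.0, 0.3, size=(20, 5000))
patients[:, :500] -= 0.4  # simulated gray matter loss
t_map, mask = vertexwise_group_difference(patients, controls)
print(f"{mask.sum()} vertices significant after Bonferroni correction")
```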

Relevance:

10.00%

Publisher:

Abstract:

Aerobic respiration is a fundamental energy-generating process; however, there is cost associated with living in an oxygen-rich environment, because partially reduced oxygen species can damage cellular components. Organisms evolved enzymes that alleviate this damage and protect the intracellular milieu, most notably thiol peroxidases, which are abundant and conserved enzymes that mediate hydrogen peroxide signaling and act as the first line of defense against oxidants in nearly all living organisms. Deletion of all eight thiol peroxidase genes in yeast (∆8 strain) is not lethal, but results in slow growth and a high mutation rate. Here we characterized mechanisms that allow yeast cells to survive under conditions of thiol peroxidase deficiency. Two independent ∆8 strains increased mitochondrial content, altered mitochondrial distribution, and became dependent on respiration for growth but they were not hypersensitive to H2O2. In addition, both strains independently acquired a second copy of chromosome XI and increased expression of genes encoded by it. Survival of ∆8 cells was dependent on mitochondrial cytochrome-c peroxidase (CCP1) and UTH1, present on chromosome XI. Coexpression of these genes in ∆8 cells led to the elimination of the extra copy of chromosome XI and improved cell growth, whereas deletion of either gene was lethal. Thus, thiol peroxidase deficiency requires dosage compensation of CCP1 and UTH1 via chromosome XI aneuploidy, wherein these proteins support hydroperoxide removal with the reducing equivalents generated by the electron transport chain. To our knowledge, this is the first evidence of adaptive aneuploidy counteracting oxidative stress.

Relevance:

10.00%

Publisher:

Abstract:

DURBAN CLIMATE CHANGE CONFERENCE: In a global day of action for climate justice, thousands of protestors complained about the slow progress in international debates on climate change at the United Nations conference in Durban. One of the chants of the campaigners was “Climate justice … not climate apartheid”. Banners dubbed the Durban event a “circus” – a “conference of polluters”.

Relevance:

10.00%

Publisher:

Abstract:

Background Nurses and midwives must be able to adapt their behaviour and language to meet the health care needs of patients and their families in diverse and at times difficult circumstances. Methods This study used Communication Accommodation Theory strategies to examine how fourth-year dual-degree nurse-midwifery students used language and discourse when managing a sequential simulation of neonatal resuscitation and bereavement support. Results The results showed that many of the students were slow to respond to the changing needs of the patient and family, and at times used ineffectual and disengaging language. Conclusion Clinical simulation is a safe and effective method for nurses and midwives to experience and practise the use of language and discourse in challenging circumstances.

Relevance:

10.00%

Publisher:

Abstract:

Background Despite the critical role of immunoglobulin E (IgE) in allergy, circulating IgE+ B cells are scarce. Here, we describe, in patients with allergic rhinitis, B cells with a memory phenotype that respond to a prototypic aeroallergen. Methods Fifteen allergic rhinitis patients with grass pollen allergy and 13 control subjects were examined. Blood mononuclear cells stained with carboxyfluorescein diacetate succinimidyl ester (CFSE) were cultured with Bahia grass pollen. Proliferation and phenotype were assessed by multicolour flow cytometry. Results In the blood of allergic rhinitis patients with high serum IgE to grass pollen, most IgEhi cells were CD123+ HLA-DR- basophils with IgE for the major pollen allergen (Pas n 1). Both B and T cells from pollen-allergic donors showed higher proliferation to grass pollen than those from nonallergic donors (P = 0.002 and P = 0.010, respectively), whereas responses to vaccine antigens and mitogen did not differ between groups. Allergen-driven B cells that divided rapidly (CD19mid CD3- CFSElo) showed higher CD27 (P = 0.008) and lower CD19 (P = 0.004) and CD20 (P = 0.004) expression than B cells that were slow to respond to allergen (CD19hi CD3- CFSEmid). Moreover, rapidly dividing allergen-driven B cells (CD19mid CFSElo CD27hi) showed higher expression of the plasmablast marker CD38 than B cells (CD19hi CFSEmid CD27lo) that were slow to divide. Conclusion Patients with pollen allergy, but not control donors, have a population of circulating allergen-specific B cells with the phenotype and functional properties of adaptive memory B-cell responses. These cells could provide precursors for allergen-specific IgE production upon allergen re-exposure. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

Relevance:

10.00%

Publisher:

Abstract:

Cost estimating has been acknowledged as a crucial component of construction projects. Depending on available information and project requirements, cost estimates evolve in tandem with the project lifecycle stages: conceptualisation, design development, execution and facility management. The accuracy of cost estimates is crucial to producing project tenders and, eventually, to budget management. Notwithstanding the initial slow pace of its adoption, Building Information Modelling (BIM) has successfully addressed a number of challenges previously characteristic of traditional approaches in the AEC industry, including poor communication, the prevalence of islands of information and frequent rework. It is therefore conceivable that BIM can be leveraged to address specific shortcomings of cost estimation. The impetus for leveraging BIM models for accurate cost estimation is to align budgeted and actual costs. This paper hypothesises that the accuracy of BIM-based estimation, as a more efficient process mirror of traditional cost estimation methods, can be enhanced by simulating traditional cost estimation factor variables. Through literature reviews and preliminary expert interviews, this paper explores the factors that could potentially lead to more accurate cost estimates for construction projects. The findings show numerous factors that affect cost estimates, ranging from project information and its characteristics to the project team, clients, contractual matters and other external influences. This paper makes a particular contribution to the early phase of BIM-based project estimation.