104 results for random walk and efficiency


Relevance:

100.00%

Publisher:

Abstract:

The generation of solar thermal power depends on the amount of sunlight exposure, as influenced by the day-night cycle and seasonal variations. In this paper, robust optimisation is applied to the design of a power block and turbine generating 30 MWe from a concentrated solar resource at 560 °C. The robust approach is important for attaining high average performance (minimum efficiency change) over the expected operating ranges of temperature, speed and mass flow. The final objective function combines the turbine performance and efficiency weighted by the off-design performance. The resulting robust optimisation methodology, as presented in the paper, provides information that greatly aids the design of non-classical power blocks by considering off-design conditions and the resulting performance.
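
As a minimal sketch of how such a robust objective might be set up, assuming a hypothetical surrogate efficiency model and an evenly sampled grid of off-design operating points (the paper's actual turbine model, sampling scheme and weighting are not reproduced here), the off-design efficiencies can be averaged and penalised by their spread:

```python
import itertools

import numpy as np


def efficiency(temperature, speed, mass_flow):
    """Hypothetical surrogate for turbine efficiency (placeholder only)."""
    # Peak efficiency near the nominal design point, falling off quadratically.
    t_dev = (temperature - 560.0) / 50.0
    s_dev = (speed - 1.0) / 0.2
    m_dev = (mass_flow - 1.0) / 0.2
    return 0.90 - 0.05 * (t_dev**2 + s_dev**2 + m_dev**2)


def robust_objective(off_design_points, spread_weight=1.0):
    """Mean off-design efficiency penalised by its spread (a robustness proxy)."""
    effs = np.array([efficiency(*point) for point in off_design_points])
    return effs.mean() - spread_weight * effs.std()


# Sample the expected operating ranges of temperature, speed and mass flow.
temperatures = [520.0, 560.0, 600.0]   # degrees C
speeds = [0.9, 1.0, 1.1]               # normalised shaft speed
mass_flows = [0.8, 1.0, 1.2]           # normalised mass flow
grid = list(itertools.product(temperatures, speeds, mass_flows))

print("robust objective over the operating grid:", robust_objective(grid))
```

A design optimiser would then maximise this robust objective over the geometric design variables rather than the design-point efficiency alone.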

Relevance:

100.00%

Publisher:

Abstract:

Light-emitting field-effect transistors (LEFETs) are emerging as a multi-functional class of optoelectronic devices. LEFETs can simultaneously execute light emission and the standard logic functions of a transistor in a single architecture. However, current LEFET architectures deliver either high brightness or high efficiency, but not both concurrently, thus limiting their use in technological applications. Here we show an LEFET device strategy that simultaneously improves brightness and efficiency. The key step change in LEFET performance arises from a bottom-gate, top-contact device architecture in which the source/drain electrodes are semitransparent and the active channel contains a bilayer comprising a high-mobility charge-transporting polymer and a yellow-green emissive polymer. A record external quantum efficiency (EQE) of 2.1% at 1000 cd/m² is demonstrated for polymer-based bilayer LEFETs.

Relevance:

100.00%

Publisher:

Abstract:

Collective cell spreading is frequently observed in development, tissue repair and disease progression. Mathematical modelling used in conjunction with experimental investigation can provide key insights into the mechanisms driving the spread of cell populations. In this study, we investigated how experimental and modelling frameworks can be used to identify several key features underlying collective cell spreading. In particular, we were able to independently quantify the roles of cell motility and cell proliferation in a spreading cell population, and investigate how these roles are influenced by factors such as the initial cell density, type of cell population and the assay geometry.
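
A minimal sketch of the kind of discrete model often used in such studies, assuming hypothetical motility and proliferation probabilities for agents on a one-dimensional lattice with at most one agent per site; the paper's actual model, assay geometries and parameter values are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)


def simulate_spread(lattice_size=200, initial_width=20,
                    motility_prob=1.0, prolif_prob=0.01, steps=500):
    """1D lattice random walk with exclusion and proliferation.

    occupancy[i] is True when site i holds a cell; at most one cell per site.
    """
    occupancy = np.zeros(lattice_size, dtype=bool)
    centre = lattice_size // 2
    occupancy[centre - initial_width // 2:centre + initial_width // 2] = True

    for _ in range(steps):
        # Motility: each cell attempts a step to a randomly chosen neighbour.
        for site in rng.permutation(np.flatnonzero(occupancy)):
            if rng.random() < motility_prob:
                target = site + rng.choice((-1, 1))
                if 0 <= target < lattice_size and not occupancy[target]:
                    occupancy[site], occupancy[target] = False, True
        # Proliferation: each cell attempts to place a daughter on a neighbour.
        for site in rng.permutation(np.flatnonzero(occupancy)):
            if rng.random() < prolif_prob:
                target = site + rng.choice((-1, 1))
                if 0 <= target < lattice_size and not occupancy[target]:
                    occupancy[target] = True

    return occupancy


final = simulate_spread()
print("final population:", int(final.sum()))
```

Because motility and proliferation enter as separate probabilities, their contributions to the spreading rate can be varied and quantified independently, mirroring the separation described above.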

Relevance:

100.00%

Publisher:

Abstract:

Lateralization of temporal lobe epilepsy (TLE) is critical for successful outcome of surgery to relieve seizures. TLE affects brain regions beyond the temporal lobes and has been associated with aberrant brain networks, based on evidence from functional magnetic resonance imaging. We present here a machine learning-based method for determining the laterality of TLE, using features extracted from resting-state functional connectivity of the brain. A comprehensive feature space was constructed to include network properties within local brain regions, between brain regions, and across the whole network. Feature selection was performed based on random forest and a support vector machine was employed to train a linear model to predict the laterality of TLE on unseen patients. A leave-one-patient-out cross validation was carried out on 12 patients and a prediction accuracy of 83% was achieved. The importance of selected features was analyzed to demonstrate the contribution of resting-state connectivity attributes at voxel, region, and network levels to TLE lateralization.
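
A minimal sketch of the described pipeline using scikit-learn and synthetic stand-in data; the feature matrix, selection threshold and SVM settings below are illustrative assumptions, not the study's actual resting-state connectivity features:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for the connectivity feature matrix:
# 12 patients x 500 features, binary laterality labels (left/right TLE).
X = rng.normal(size=(12, 500))
y = np.array([0] * 6 + [1] * 6)

pipeline = make_pipeline(
    # Keep the features the random forest ranks as most important.
    SelectFromModel(RandomForestClassifier(n_estimators=200, random_state=0),
                    threshold="median"),
    # Train a linear SVM on the selected features.
    SVC(kernel="linear", C=1.0),
)

# Leave-one-patient-out cross-validation (one fold per patient).
scores = cross_val_score(pipeline, X, y, cv=LeaveOneOut())
print("LOOCV accuracy:", scores.mean())
```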

Relevance:

100.00%

Publisher:

Abstract:

Digital technology offers enormous benefits (economic, quality of design and efficiency in use) if adopted to implement integrated ways of representing the physical world in digital form. When applied across the full extent of the built and natural world, it is referred to as the Digital Built Environment (DBE) and encompasses a wide range of approaches and technology initiatives, all aimed at the same end goal: the development of a virtual world that sufficiently mirrors the real world to form the basis for the smart cities of the present and future, enable efficient infrastructure design and programmed maintenance, and create a new foundation for economic growth and social well-being through evidence-based analysis. The creation of a National Data Policy for the DBE will facilitate the creation of additional high-technology industries in Australia; provide Governments, industries and citizens with greater knowledge of the environments they occupy and plan; and offer citizen-driven innovations for the future. Australia has slipped behind other nations in the adoption and execution of Building Information Modelling (BIM), and the principal concern is that the gap is widening. Data-driven innovation added $67 billion to the Australian economy in 2013 [1]. Strong open data policy equates to $16 billion in new value [2]. Australian Government initiatives such as the Digital Earth-inspired "National Map" offer a platform and pathway to embrace the concept of a "BIM Globe", while also leveraging unprecedented growth in open source / open data collaboration. Australia must address the challenges by learning from international experiences, most notably the UK and NZ, and mandate the use of BIM across Government, extending the Framework for Spatial Data Foundation to include the Built Environment as a theme and engaging collaboration through a "BIM Globe" metaphor. This proposed DBE strategy will modernise Australian urban planning and the construction industry. It will change the way we develop our cities by fundamentally altering the dynamics and behaviours of the supply chains and unlocking new and more efficient ways of collaborating at all stages of the project life-cycle. There are currently two major modelling approaches that contribute to the challenge of delivering the DBE. Though these collectively encompass many (often competing) approaches and proprietary software systems, all can be categorised as either a spatial modelling approach, where the focus is generally on representing the elements that make up the world within their geographic context, or a construction modelling approach, where the focus is on models that support the life-cycle management of the built environment. These two approaches have tended to evolve independently, addressing two broad industry sectors: one is concerned with understanding and managing global and regional aspects of the world that we inhabit, including disciplines concerned with climate, earth sciences, land ownership, urban and regional planning and infrastructure management; the other is concerned with the planning, design, construction and operation of built facilities and includes architectural and engineering design, product manufacturing, construction, facility management and related disciplines (a process/technology commonly known as Building Information Modelling, BIM).
The spatial industries have a strong voice in the development of public policy in Australia, while the construction sector, which in 2014 accounted for around 8.5% of Australia's GDP [3], has no single voice and, because of its diversity, is struggling to adapt to and take advantage of the opportunity presented by these digital technologies. The experience in the UK over the past few years has demonstrated that government leadership is very effective in stimulating industry adoption of digital technologies: on the one hand, mandating the use of BIM on public procurement projects while, at the same time, providing comparatively modest funding to address the common issues that confront the industry in adopting that way of working across the supply chain. The reported result has been savings of £840m in construction costs in 2013/14, according to UK Cabinet Office figures [4]. There is worldwide recognition of the value of bringing these two modelling technologies together. Australia has the expertise to exercise leadership in this work, but it requires a commitment by government to recognise the importance of BIM as a companion methodology to the spatial technologies, so that these two disciplinary domains can cooperate in the development of data policies and information exchange standards to smooth out common workflows. buildingSMART Australasia, SIBA and their academic partners have initiated this dialogue in Australia and wish to work collaboratively, with government support and leadership, to explore the opportunities open to us as we develop an Australasian Digital Built Environment. As part of that programme, we must develop and implement a strategy to accelerate the adoption of BIM processes across the Australian construction sector while, at the same time, developing an integrated approach in concert with the spatial sector that will position Australia at the forefront of international best practice in this area. Australia and New Zealand cannot afford to be on the back foot as we face the challenges of rapid urbanisation and change in the global environment. Although we can identify some exemplary initiatives in this area, particularly in New Zealand in response to the need for more resilient urban development in the face of earthquake threats, there is still much that needs to be done. We are well situated in the Asian region to take a lead in this challenge, but we are at imminent risk of losing the initiative if we do not take action now. Strategic collaboration between Governments, Industry and Academia will create new jobs and wealth, with the potential, for example, to save around 20% on the delivery costs of new built assets, based on recent UK estimates.

Relevance:

100.00%

Publisher:

Abstract:

The deployment of new emerging technologies, such as cooperative systems, allows the traffic community to foresee relevant improvements in terms of traffic safety and efficiency. Autonomous vehicles are able to share information about the local traffic state in real time, which could result in a better reaction to the mechanism of traffic jam formation. An upstream single-hop radio broadcast network can improve the perception of each cooperative driver within a specific radio range, and hence the traffic stability. The impact of vehicle-to-vehicle cooperation on the onset of traffic congestion is investigated analytically and through simulation. A Next Generation Simulation (NGSIM) field dataset is used to calibrate the full velocity difference car-following model, and the MOBIL lane-changing model is implemented. The robustness of the calibration as well as the heterogeneity of the drivers is discussed. Assuming that congestion can be triggered either by the heterogeneity of drivers' behaviours or by abnormal lane-changing behaviours, the calibrated car-following model is used to assess the impact of a microscopic cooperative law on egoistic lane-changing behaviours. The cooperative law can help reduce and delay traffic congestion and can have a positive effect on safety indicators.
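
As a hedged illustration of the car-following model named above, the full velocity difference (FVD) model computes each vehicle's acceleration from its gap to the leader and their speed difference; the optimal-velocity parameters below are commonly quoted illustrative values, not the calibration obtained from the field data in this study:

```python
import numpy as np


def optimal_velocity(gap, v1=6.75, v2=7.91, c1=0.13, c2=1.57, car_length=5.0):
    """Optimal-velocity function V(gap) in tanh form (parameters illustrative)."""
    return v1 + v2 * np.tanh(c1 * (gap - car_length) - c2)


def fvd_acceleration(gap, speed, leader_speed, kappa=0.41, lam=0.5):
    """Full velocity difference model:
    a = kappa * (V(gap) - v) + lambda * (v_leader - v)."""
    return kappa * (optimal_velocity(gap) - speed) + lam * (leader_speed - speed)


# One follower tracking a leader that brakes briefly; explicit Euler integration.
dt, steps = 0.1, 600
leader_pos, pos, speed = 50.0, 0.0, 15.0
for step in range(steps):
    leader_speed = 5.0 if 100 < step < 200 else 15.0   # leader slows temporarily
    leader_pos += leader_speed * dt
    speed = max(0.0, speed + fvd_acceleration(leader_pos - pos, speed, leader_speed) * dt)
    pos += speed * dt

print("final gap to leader:", leader_pos - pos)
```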

Relevance:

100.00%

Publisher:

Abstract:

As more raw sugar factories become involved in the manufacture of by-products and cogeneration, bagasse is becoming an increasingly valuable commodity. However, in most factories, most of the bagasse produced is used to generate steam in relatively old and inefficient boilers. Efficient bagasse-fired boilers are a high-capital-cost item, and the cost of supplying the steam required to run a sugar factory by other means is prohibitive. For many factories a more realistic way to reduce bagasse consumption is to increase the efficiency of existing boilers. The Farleigh No. 3 boiler is a relatively old, low-efficiency boiler. Like many in the industry, its performance has been adversely affected by uneven gas and air flow distributions and air heater leaks. The combustion performance and efficiency of this boiler have been significantly improved by making the gas and air flow distributions through the boiler more uniform and repairing the air heater. The estimated bagasse savings easily justify the cost of the boiler improvements.

Relevance:

100.00%

Publisher:

Abstract:

This study seeks to understand the prevailing status of Nepalese media portrayal of natural disasters and to develop a disaster management framework to improve the effectiveness and efficiency of news production across the continuum of prevention, preparedness, response and recovery (PPRR) phases of disaster management. The study is currently in progress and is being undertaken in three phases. In Phase 1, a qualitative content analysis is conducted: news contents are categorised into the frames proposed by framing theory along with pre-defined frames. The researcher has also drawn on theories of the press, in particular social responsibility theory, as this is regarded as the major obligation of the media towards society. The contents are then categorised according to the PPRR cycle. In Phase 2, based on the findings of the content analysis, 12 in-depth interviews with journalists, disaster managers and community leaders are conducted. In Phase 3, drawing on the findings of the content analysis and in-depth interviews, a framework for effective media management of disasters is developed using thematic analysis. As the study is still in progress, findings from the pilot study are presented here. The response phase of disasters is the most commonly reported in Nepal, with relatively low coverage of preparedness and prevention. The responsibility frame is the most prevalent in the news, followed by human interest; economic consequences and conflict frames are also used, and vulnerability assessment has been used as an additional frame. The outcomes of this study are multifaceted. At the micro level, people will benefit through a reduction in the loss of human lives and property enabled by effective dissemination of information in news and other modes of media: they will be 'well prepared for', 'able to prevent', 'respond to' and 'recover from' natural disasters. At the meso level, the media industry will benefit from having its own 'disaster management model of news production' as an effective disaster reporting tool, improving the media's editorial judgement and priorities. At the macro level, the framework will assist government and other agencies to develop appropriate policies and strategies for better management of natural disasters.

Relevance:

100.00%

Publisher:

Abstract:

The photocatalytic ability of cubic Bi1.5ZnNb1.5O7 (BZN) pyrochlore for the decolorization of an acid orange 7 (AO7) azo dye in aqueous solution under ultraviolet (UV) irradiation has been investigated for the first time. BZN catalyst powders prepared using low-temperature sol-gel and higher-temperature solid-state methods have been evaluated and their reaction rates have been compared. The experimental band gap energy has been estimated from the optical absorption edge and has been used as a reference for theoretical calculations. The electronic band structure of BZN has been investigated using first-principles density functional theory (DFT) calculations for random, completely ordered and partially ordered solid solutions of Zn cations in both the A and B sites of the pyrochlore structure. The nature of the orbitals in the valence band (VB) and the conduction band (CB) has been identified, and the theoretical band gap energy has been discussed in terms of the DFT model approximations.

Relevance:

100.00%

Publisher:

Abstract:

Diffusive transport is a universal phenomenon throughout both the biological and physical sciences, and models of diffusion are routinely used to interrogate diffusion-driven processes. However, most models neglect the role of volume exclusion, which can significantly alter diffusive transport, particularly within biological systems where the diffusing particles might occupy a significant fraction of the available space. In this work we use a random walk approach to reconcile models that incorporate crowding effects on different spatial scales. Our work demonstrates that coarse-grained models incorporating simplified descriptions of excluded volume can be used in many circumstances, but that care must be taken not to push the coarse-graining process too far.
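
A minimal sketch of a random walk with volume exclusion, assuming hard-core walkers on a periodic one-dimensional lattice; the specific models and coarse-graining procedure of the paper are not reproduced here. Increasing the occupied fraction suppresses the mean-squared displacement, which is the crowding effect a coarse-grained description must capture:

```python
import numpy as np

rng = np.random.default_rng(1)


def msd_with_crowding(lattice_size=200, occupancy_fraction=0.3, steps=400):
    """Mean-squared displacement of hard-core random walkers on a 1D ring.

    Each walker attempts a unit step left or right; moves into occupied
    sites are aborted, which is the simplest excluded-volume rule.
    """
    n_walkers = int(occupancy_fraction * lattice_size)
    sites = rng.choice(lattice_size, size=n_walkers, replace=False)
    occupied = np.zeros(lattice_size, dtype=bool)
    occupied[sites] = True
    displacement = np.zeros(n_walkers)

    for _ in range(steps):
        for idx in rng.permutation(n_walkers):
            step = rng.choice((-1, 1))
            target = (sites[idx] + step) % lattice_size   # periodic boundary
            if not occupied[target]:
                occupied[sites[idx]] = False
                occupied[target] = True
                sites[idx] = target
                displacement[idx] += step                 # unwrapped displacement
    return np.mean(displacement ** 2)


for phi in (0.05, 0.3, 0.6):
    print(f"occupied fraction {phi:.2f}: MSD = {msd_with_crowding(occupancy_fraction=phi):.1f}")
```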

Relevance:

100.00%

Publisher:

Abstract:

This paper identifies two narratives of the Anthropocene and explores how they play out in the realm of future-looking fashion production. Each narrative draws on mythic comparisons to gods and monsters to express humanity's dilemmas, albeit from different perspectives. The first is a Malthusian narrative of collapse and scarcity, brought about by the monstrous, unstoppable nature of human technology set loose on the natural world. In this vein, philosopher Slavoj Zizek (2010) draws on Biblical analogies, likening ecological crisis to one of the four horsemen of the apocalypse. To find a myth to suit the present times, novelist A.S. Byatt (2011) proposes Ragnarök, a Norse myth in which the gods destroy themselves. In contrast, the second narrative is one of technological cornucopia. Stewart Brand (2009, 27), a self-described 'eco-pragmatist', writes, 'we are as gods and we have to get good at it'. In his view, human technologies offer the only hope of mitigating the problems caused by human technology: Brand suggests harnessing nuclear power, the bioengineering of crops and the geoengineering of the planet as the way forward. Similarly, the French philosopher Bruno Latour (2012, 274) exhorts us to "love our monsters", likening our technologies to Doctor Frankenstein's monster, set loose upon the world and then reviled by his creator. For both Brand and Latour, human technology may be monstrous, but it must also be turned toward solutions. Within this schema, hopeful visions of the future of fashion are similarly divided. In the techno-enabled cornucopian future, the fashion industry embraces wearable technology, speed and efficiency. Technologies such as waterless dyeing, 3D printing and self-cleaning garments shift fashion into a new era of cleaner production. Meanwhile, in the narrative of scarcity, a more cautious approach sees fashion return to a new localism and a valuing of the hand-made in a time of shrinking resources. Through discussion of future-looking fashion designers, brands and activists, this paper explores how they may align, along a spectrum, with one of these two grand narratives of the future. The paper will discuss how these narratives may unconsciously shape the perspective of both producers and users around the fashion of today and the fashion of tomorrow. This paper poses the question: what stories can be written for fashion's future in the Anthropocene, and are they fated, or can they be re-written?

Relevance:

100.00%

Publisher:

Abstract:

Multi-agent systems involve a high degree of concurrency at both the inter- and intra-agent levels. Scalable fault-tolerant Agent Grooming Environment (SAGE), a second-generation, FIPA-compliant MAS, requires a built-in mechanism to achieve both inter- and intra-agent concurrency. This paper dilates upon an attempt to provide a reliable, efficient and lightweight solution for intra-agent concurrency within the internal agent architecture of SAGE. It addresses the issues related to using the Java threading model to provide this level of concurrency to the agent and provides an alternative approach based on an event-driven, concurrent and user-scalable multi-tasking model for the agent's internal model. The findings of this paper show that the proposed approach is suitable for providing an efficient and lightweight concurrent task model for SAGE and considerably outperforms the multithreaded tasking model based on Java in terms of throughput and efficiency. This has been illustrated through the practical implementation and evaluation of both models. © 2004 IEEE.
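
The SAGE task model itself is not reproduced here; the following is a generic sketch, in Python rather than Java, of the contrast the abstract draws between a thread-per-task model and a single-threaded event-driven task queue:

```python
import threading
import time
from collections import deque


def thread_per_task(tasks):
    """Thread-per-task model: one OS thread per agent task (heavier per task)."""
    threads = [threading.Thread(target=task) for task in tasks]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()


def event_driven(tasks):
    """Event-driven model: a single loop drains a queue of short, non-blocking
    task steps, avoiding per-task thread creation and context switching."""
    queue = deque(tasks)
    while queue:
        queue.popleft()()   # run the next ready task to completion


def tiny_task():
    # Stand-in for a short, non-blocking unit of agent work.
    sum(range(1000))


tasks = [tiny_task] * 200

start = time.perf_counter()
thread_per_task(tasks)
print("thread-per-task:", time.perf_counter() - start, "s")

start = time.perf_counter()
event_driven(tasks)
print("event-driven:   ", time.perf_counter() - start, "s")
```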

Relevance:

100.00%

Publisher:

Abstract:

Historically, school leaders have occupied a somewhat ambiguous position within networks of power. On the one hand, they appear to be celebrated as what Ball (2003) has termed the ‘new hero of educational reform'; on the other, they are often ‘held to account’ through those same performative processes and technologies. These have become compelling in schools and principals are ‘doubly bound’ through this. Adopting a Foucauldian notion of discursive production, this paper addresses the ways that the discursive ‘field’ of ‘principal’ (within larger regimes of truth such as schools, leadership, quality and efficiency) is produced. It explores how individual principals understand their roles and ethics within those practices of audit emerging in school governance, and how their self-regulation is constituted through NAPLAN – the National Assessment Program, Literacy and Numeracy. A key effect of NAPLAN has been the rise of auditing practices that change how education is valued. Open-ended interviews with 13 primary and secondary school principals from Western Australia, South Australia and New South Wales asked how they perceived NAPLAN's impact on their work, their relationships within their school community and their ethical practice.

Relevance:

100.00%

Publisher:

Abstract:

Identifying unusual or anomalous patterns in an underlying dataset is an important but challenging task in many applications. The focus of the unsupervised anomaly detection literature has mostly been on vectorised data. However, many applications are more naturally described using higher-order tensor representations. Approaches that vectorise tensorial data can destroy the structural information encoded in the high-dimensional space and suffer from the curse of dimensionality. In this paper we present the first unsupervised tensorial anomaly detection method, along with a randomised version of our method. Our anomaly detection method, the One-class Support Tensor Machine (1STM), is a generalisation of conventional one-class Support Vector Machines to higher-order spaces. 1STM preserves the multiway structure of tensor data, while achieving significant improvement in accuracy and efficiency over conventional vectorised methods. We then leverage the theory of nonlinear random projections to propose the Randomised 1STM (R1STM). Our empirical analysis on several real and synthetic datasets shows that the R1STM algorithm delivers accuracy comparable to or better than a state-of-the-art deep learning method and traditional kernelised approaches for anomaly detection, while being approximately 100 times faster in training and testing.
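
The 1STM and R1STM algorithms are not reproduced here; as a loose illustration of the general recipe of pairing a nonlinear random projection with a one-class SVM, the sketch below vectorises synthetic tensors (the very step 1STM avoids), maps them through random Fourier features and fits a linear one-class SVM:

```python
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.pipeline import make_pipeline
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Synthetic stand-in data: 300 normal samples as 8x8x8 tensors plus 10 anomalies.
normal = rng.normal(0.0, 1.0, size=(300, 8, 8, 8))
anomalous = rng.normal(4.0, 1.0, size=(10, 8, 8, 8))

# Vectorise the tensors (used here purely for illustration).
X_train = normal.reshape(len(normal), -1)
X_test = np.vstack([normal[:10].reshape(10, -1), anomalous.reshape(10, -1)])

model = make_pipeline(
    # Nonlinear random projection via random Fourier features.
    RBFSampler(gamma=1.0 / X_train.shape[1], n_components=100, random_state=0),
    # Linear one-class SVM in the projected feature space.
    OneClassSVM(kernel="linear", nu=0.05),
)
model.fit(X_train)

predictions = model.predict(X_test)   # +1 = normal, -1 = anomaly
print("test rows flagged as anomalies:", np.flatnonzero(predictions == -1))
```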