859 results for cost and benefit
Abstract:
The traffic carried by core optical networks grows at a steady but remarkable pace of 30-40% year over year. Advances in optical transmission and networking continue to satisfy these traffic requirements by delivering content over the network infrastructure in a cost- and energy-efficient manner. Such core optical networks serve information traffic demands dynamically, responding to shifts in demand both temporal (day/night) and spatial (business district/residential). However, as we approach the fundamental spectral-efficiency limits of single-mode fibers, the scientific community has recently been pursuing the development of an innovative, all-optical network architecture that introduces the spatial degree of freedom into the design and operation of future transport networks. Space-division multiplexing, through the use of bundled single-mode fibers, multi-core fibers and/or few-mode fibers, can offer up to a 100-fold capacity increase in future optical networks. The EU INSPACE project is working on the development of a complete spatial-spectral flexible optical networking solution, offering the ultra-high capacity, flexibility and energy efficiency required to meet the challenge of delivering the exponentially growing traffic demands of the internet over the next twenty years. In this paper we present the motivation and main research activities of the INSPACE consortium towards the realization of the overall project solution. © 2014 Copyright SPIE.
Abstract:
The UK government aims to achieve an 80% reduction in CO2 emissions by 2050, which requires collective effort across all UK industry sectors. The housing sector in particular has large potential to contribute, because it alone accounts for 27% of total UK CO2 emissions, and 87% of the housing responsible for that 27% will still be standing in 2050. It is therefore essential to improve the energy efficiency of the existing housing stock, which was built to low energy-efficiency standards. To this end, whole houses need to be refurbished in a sustainable way, considering the lifetime financial and environmental impacts of a refurbished house. However, the current refurbishment process struggles to generate financially and environmentally affordable refurbishment solutions, owing to the highly fragmented nature of refurbishment practice and a lack of knowledge and skills about whole-house refurbishment in the construction industry. To generate an affordable refurbishment solution, diverse information on the costs and environmental impacts of refurbishment measures and materials should be collected and integrated, in the right sequence, among key project stakeholders throughout the refurbishment project life cycle. Consequently, researchers are increasingly studying ways of using Building Information Modelling (BIM) to tackle current problems in the construction industry, because BIM can support construction professionals in managing projects collaboratively by integrating diverse information, and in determining the best refurbishment solution among alternatives by calculating the life cycle costs and lifetime CO2 performance of each solution. Despite these capabilities, BIM adoption in the housing sector remains low, at 25%, and the use of BIM for housing refurbishment projects has rarely been studied.
This research therefore aims to develop a BIM framework for formulating a financially and environmentally affordable whole-house refurbishment solution based simultaneously on the Life Cycle Costing (LCC) and Life Cycle Assessment (LCA) methods. To achieve this aim, a BIM feasibility study was first conducted as a pilot to examine whether BIM is suitable for housing refurbishment, and a BIM framework was then developed using grounded theory, since no precedent research existed. The framework was examined through a hypothetical case study using BIM input data collected from a questionnaire survey of homeowners' preferences for housing refurbishment. Finally, the BIM framework was validated by academics and professionals, who were provided with the framework and a refurbishment solution formulated through it on the basis of the LCC and LCA studies. As a result, BIM was identified as suitable for housing refurbishment as a management tool, and the development of a BIM framework was found to be timely. A BIM framework with seven project stages was developed to formulate an affordable refurbishment solution. Through the case study, the Building Regulations standard was identified as the most affordable energy-efficiency standard, yielding the best LCC and LCA results when applied to a whole-house refurbishment solution. In addition, the Fabric Energy Efficiency Standard (FEES) is recommended when customers are willing to adopt a higher energy standard, and up to 60% of CO2 emissions can be cut through whole-house fabric refurbishment to the FEES. Furthermore, limitations and challenges to fully utilising the BIM framework for housing refurbishment were revealed, such as a lack of BIM objects with proper cost and environmental information, limited interoperability between different BIM software packages, and limited LCC and LCA datasets in BIM systems.
Finally, the BIM framework was validated as suitable for housing refurbishment projects, and reviewers commented that the framework could be more practical if a specific BIM library for housing refurbishment, with proper LCC and LCA datasets, were developed. This research is expected to provide a systematic way of formulating a refurbishment solution using BIM, and to become a basis for further research on BIM for the housing sector that resolves the current limitations and challenges. Future research should enhance the BIM framework by developing a more detailed process map and BIM objects with proper LCC and LCA information.
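As a minimal illustration of the LCC calculation that underpins the framework (a plain net-present-value sketch; the discount rate and cost figures below are hypothetical, not taken from this research):

```python
def life_cycle_cost(capital, annual_costs, discount_rate):
    """Life Cycle Cost as net present value: upfront capital plus
    discounted annual running costs over the study period."""
    npv = capital
    for year, cost in enumerate(annual_costs, start=1):
        npv += cost / (1 + discount_rate) ** year
    return npv

# Hypothetical comparison over a 30-year life: do nothing vs. a
# whole-house fabric refurbishment that cuts annual energy costs.
do_nothing = life_cycle_cost(0, [1200] * 30, 0.035)
refurbished = life_cycle_cost(15000, [480] * 30, 0.035)
print(f"do nothing: {do_nothing:,.0f}  refurbished: {refurbished:,.0f}")
```

The same structure extends to the LCA side by swapping monetary costs for annual CO2 figures (typically left undiscounted).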
Abstract:
In recent years, offshoring and outsourcing have fundamentally transformed nationally based auto sectors into global networks of design, production and distribution, organised in global value chains coordinated by the major automotive Original Equipment Manufacturers (OEMs). While manufacturing activities have tended to shift to low-labour-cost locations in Asia, Africa and Latin America, high-end design, R&D and product development have stayed anchored mostly in high-cost, knowledge-intensive home-economy locations (with the exception, perhaps, of some design and styling activities, which are often located in major end markets around the world). Very recently, however, the weaknesses of and risks inherent in such global value chains (GVCs) have been exposed, triggering attempts to rethink their nature and raising the possibility of reshoring some manufacturing activities to home countries. A combination of a more competitive exchange rate (despite the very recent appreciation of sterling), increased transport costs, rising wages in key areas of China, and a greater awareness of supply chain resilience have all contributed to a perceived change in some business fundamentals. The potential for some supply chain relocalisation also links in with the servitisation of manufacturing, including in the auto sector, and the shift to a hybrid model in which manufacturing and services are increasingly intertwined. However, there are limits to how far this can go, and these raise important questions and issues over the possible role for industrial policy.
Abstract:
Fiber Bragg gratings can be used for monitoring different parameters in a wide variety of materials and constructions. The interrogation of fiber Bragg gratings traditionally relies on an expensive and bulky peak-tracking or spectrum-analyzing unit which needs to be deployed outside the monitored structure. We present a dynamic low-cost interrogation system for fiber Bragg gratings which can be integrated with the fiber itself, limiting the fragile optical in- and out-coupling interfaces and providing a compact, unobtrusive driving and read-out unit. The reported system is based on an embedded Vertical Cavity Surface Emitting Laser (VCSEL) which is tuned dynamically at 1 kHz and an embedded photodiode. Fiber coupling is provided through a dedicated 45° micromirror yielding a 90° in-the-plane coupling and limiting the total thickness of the fiber-coupled optoelectronic package to 550 µm. The red-shift of the VCSEL wavelength provides a full reconstruction of the spectrum over a range of 2.5 nm. A few-mode fiber with fiber Bragg gratings at 850 nm is used to prove the feasibility of this low-cost and ultra-compact interrogation approach.
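The interrogation principle rests on the first-order Bragg condition λ_B = 2·n_eff·Λ, with the swept VCSEL acting as the wavelength reference. A minimal sketch (the effective index, grating period, sweep endpoints and sample count below are illustrative assumptions, not values from the paper):

```python
def bragg_wavelength_nm(n_eff, period_nm):
    """First-order Bragg condition: lambda_B = 2 * n_eff * period."""
    return 2.0 * n_eff * period_nm

def vcsel_sweep_wavelength_nm(sample_idx, n_samples, start_nm=849.0, span_nm=2.5):
    """Map a photodiode sample taken during the 1 kHz current sweep to the
    instantaneous (red-shifted) VCSEL wavelength, assuming a linear sweep
    covering the 2.5 nm reconstruction range."""
    return start_nm + span_nm * sample_idx / (n_samples - 1)

# A grating reflecting at 850 nm, assuming n_eff = 1.45 for the fibre.
period = 850.0 / (2.0 * 1.45)   # ~293 nm grating period
print(bragg_wavelength_nm(1.45, period))
print(vcsel_sweep_wavelength_nm(0, 256), vcsel_sweep_wavelength_nm(255, 256))
```

Locating the reflection peak within the sampled sweep then gives the Bragg wavelength, and hence the measurand, without an external spectrum analyser.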
Abstract:
Fiber to the premises has promised to increase the capacity of telecommunications access networks for well over 30 years. While it is widely recognized that optical-fiber-based access networks will be a necessity in the short- to medium-term future, their large upfront cost and regulatory issues are pushing many operators to further postpone deployment, installing instead unambitious intermediate solutions such as fiber to the cabinet. Such high investment costs for both network access and core capacity upgrades often derive from poor planning strategies that do not consider the need to adequately modify the network architecture to fully exploit the cost benefit that a fiber-centric solution can bring. DISCUS is a European Framework 7 Integrated Project that, building on optical-centric solutions such as long-reach passive optical access and a flat optical core, aims to deliver a cost-effective architecture for ubiquitous broadband services. DISCUS analyzes, designs, and demonstrates end-to-end architectures and technologies capable of saving cost and energy by reducing the number of electronic terminations in the network and by sharing deployment costs among a larger number of users than current fiber access systems. This article describes the network architecture and the supporting technologies behind DISCUS, giving an overview of the concepts and methodologies that will be used to deliver our end-to-end network solution. © 2013 IEEE.
Abstract:
Tool life is an important factor to consider when optimising a machining process, since cutting parameters can be adjusted to optimise tool changing, reducing production cost and time. The performance of a tool is also directly linked to the surface roughness it generates, which matters in cases with strict surface-quality requirements. The prediction of tool life and the resulting surface roughness in milling operations has attracted considerable research effort. The research reported herein focuses on defining the influence of milling cutting parameters, such as cutting speed, feed rate and axial depth of cut, on three major tool performance parameters: tool life, material removal and surface roughness. The research seeks to define methods that allow the selection of optimal parameters for best tool performance when face milling 416 stainless steel bars. For this study the Taguchi method was applied, using an orthogonal array that allows the entire parameter space to be studied with only a small number of experiments, representing savings in the cost and time of experimentation. The findings were that cutting speed has the most influence on tool life and surface roughness, and very limited influence on material removal. Finally, tool performance can be judged either from tool life or from the volume of material removed.
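Taguchi analysis ranks parameter levels by signal-to-noise (S/N) ratio; a minimal sketch of the two standard criteria relevant to these responses (the replicate measurements below are hypothetical, not data from the study):

```python
import math

def sn_larger_is_better(values):
    """Taguchi S/N ratio for responses to maximise (e.g. tool life):
    S/N = -10 * log10(mean(1 / y^2))."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in values) / len(values))

def sn_smaller_is_better(values):
    """Taguchi S/N ratio for responses to minimise (e.g. surface roughness):
    S/N = -10 * log10(mean(y^2))."""
    return -10.0 * math.log10(sum(y**2 for y in values) / len(values))

# Hypothetical replicates at one orthogonal-array setting.
print(sn_larger_is_better([42.0, 45.0, 40.0]))   # tool life, minutes
print(sn_smaller_is_better([0.80, 0.90, 0.85]))  # roughness Ra, micrometres
```

Averaging the S/N ratio over the array rows sharing each level of a factor (cutting speed, feed rate, depth of cut) then identifies the most influential factor and its best level.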
Abstract:
A simple fiber sensor capable of simultaneous measurement of liquid level and refractive index (RI) is proposed and experimentally demonstrated. The sensing head is an all-fiber modal interferometer manufactured by splicing an uncoated single-mode fiber with two short sections of multimode fiber. The interference pattern experiences a blue shift with increasing axial strain and surrounding RI. Owing to the participation of multiple cladding modes with different sensitivities, the height and RI of the liquid can be measured simultaneously by monitoring two dips in the transmission spectrum. Experimental results show that the liquid-level and RI sensitivities of the two dips are 245.7 pm/mm and -38 nm/RI unit (RIU), and 223.7 pm/mm and -62 nm/RIU, respectively. The approach has the distinctive advantages of easy fabrication, low cost, and high sensitivity for liquid-level detection, with the capability of distinguishing RI variation simultaneously. © 2013 Copyright Taylor and Francis Group, LLC.
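Simultaneous measurement works by inverting the 2x2 sensitivity matrix formed by the two dips' responses; a minimal sketch using the sensitivities reported in the abstract (the pure-Python solver is our illustration, not the authors' code):

```python
def solve_level_and_ri(d1_nm, d2_nm):
    """Recover liquid-level change (mm) and RI change (RIU) from the
    wavelength shifts of the two monitored dips, using the reported
    sensitivities: 245.7 pm/mm and -38 nm/RIU for dip 1, 223.7 pm/mm
    and -62 nm/RIU for dip 2. Solved by Cramer's rule."""
    k = [[0.2457, -38.0],   # dip 1: nm/mm, nm/RIU
         [0.2237, -62.0]]   # dip 2: nm/mm, nm/RIU
    det = k[0][0] * k[1][1] - k[0][1] * k[1][0]
    d_level = (d1_nm * k[1][1] - k[0][1] * d2_nm) / det
    d_ri = (k[0][0] * d2_nm - k[1][0] * d1_nm) / det
    return d_level, d_ri

# A simulated 10 mm level rise with no RI change is recovered exactly.
shift1 = 0.2457 * 10   # dip-1 shift, nm
shift2 = 0.2237 * 10   # dip-2 shift, nm
print(solve_level_and_ri(shift1, shift2))
```

The distinct RI sensitivities of the two cladding-mode dips are what keep the matrix well-conditioned, so the two measurands can be separated.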
Abstract:
Background: Recent attention on chemotherapeutic intervention against cancer has focused on discovering and developing phytochemicals as anticancer agents with improved efficacy, low drug resistance and toxicity, low cost and limited adverse side effects. In this study, we investigated the effects of Curcuma C20-dialdehyde on growth, apoptosis and cell cycle arrest in colon and cervical cancer cell lines. Materials and Methods: The antiproliferative, apoptosis-inducing, and cell cycle arrest activities of Curcuma C20-dialdehyde were determined by WST cell proliferation assay, flow cytometric Alexa Fluor 488-annexin V/propidium iodide (PI) staining, and PI staining, respectively. Results: Curcuma C20-dialdehyde suppressed the proliferation of HCT116, HT29 and HeLa cells, with IC50 values of 65.4±1.74 μg/ml, 58.4±5.20 μg/ml and 72.0±0.03 μg/ml, respectively, at 72 h exposure. Flow cytometric analysis revealed that the percentage of early apoptotic cells increased in a dose-dependent manner upon exposure to Curcuma C20-dialdehyde. Furthermore, exposure to lower concentrations of the compound significantly induced cell cycle arrest at G1 phase in both HCT116 and HT29 cells, while higher concentrations increased sub-G1 populations. In HeLa cells, however, the concentrations used in this study did not induce cell cycle arrest but rather apoptotic cell death. Conclusions: Our findings suggest that the phytochemical Curcuma C20-dialdehyde may be a potential antineoplastic agent for colon and cervical cancer chemotherapy and/or chemoprevention. Further studies are needed to characterize the drug target and mode of action of Curcuma C20-dialdehyde as an anticancer agent.
Abstract:
Firms worldwide are taking major initiatives to reduce the carbon footprint of their supply chains in response to growing governmental and consumer pressure. In real life, these supply chains face stochastic and non-stationary demand, yet most studies of the inventory lot-sizing problem with emission concerns consider deterministic demand. In this paper, we study the inventory lot-sizing problem under non-stationary stochastic demand with emission and cycle-service-level constraints, considering a carbon cap-and-trade regulatory mechanism. Using a mixed integer linear programming model, we investigate the effects of emission parameters and of product- and system-related features on supply chain performance through extensive computational experiments, covering general business settings rather than a specific scenario. Results show that cycle service level and demand coefficient of variation have significant impacts on total cost and emissions irrespective of the level of demand variability, while the impact of a product's demand pattern is significant only at lower levels of demand variability. Finally, results also show that an increasing carbon price reduces total cost, total emissions and total inventory, and that the scope for emission reduction by increasing the carbon price is greater at higher cycle service levels and demand coefficients of variation. The analysis helps supply chain managers make the right decisions in different demand and service-level situations.
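Under cap-and-trade, the carbon term enters the objective linearly: allowances are bought when emissions exceed the cap and sold when they fall below it. A minimal sketch of that cost structure (the function is our simplification of a lot-sizing objective, and all parameter values are hypothetical):

```python
def total_cost_cap_and_trade(order_cost, holding_cost, emissions, cap, carbon_price):
    """Inventory cost plus the cap-and-trade carbon term. The firm buys
    allowances when emissions exceed the cap and sells surplus allowances
    below it, so the carbon term can be negative (a revenue)."""
    return order_cost + holding_cost + carbon_price * (emissions - cap)

# Hypothetical planning horizon: 100 t emitted against an 80 t cap.
print(total_cost_cap_and_trade(5000.0, 1200.0, 100.0, 80.0, 25.0))  # -> 6700.0
```

Raising the carbon price steepens this term, which is why the model finds that a higher price pushes plans toward lower emissions and lower inventory.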
Abstract:
Local air quality was one of the main stimulants for low-carbon vehicle development during the 1990s. Issues of national fuel security and global air quality (climate change) have added pressure for their development, stimulating schemes to facilitate their deployment in the UK. In this case study, Coventry City Council aimed to adopt an in-house fleet of electric and hybrid-electric vehicles to replace business mileage paid for in employees' private vehicles. This study compared the proposed vehicle technologies, in terms of costs and air quality, over projected scenarios of typical use. The study found that under 2009 conditions the electric and hybrid fleet could not compete on cost with the current business model because of untested assumptions, but certain emissions were significantly reduced, by more than 50%. Climate-change gas emissions were most drastically reduced where electric vehicles were adopted, because the electricity supply was generated from renewable energy sources. The study identified the key cost barriers and benefits to adoption of low-emission vehicles under current conditions in the Coventry fleet. Low-emission vehicles achieved significant reductions per vehicle in air-pollution-associated health costs and atmospheric emissions, and widespread adoption in cities could deliver significant change. © The Author 2011. Published by Oxford University Press. All rights reserved.
Abstract:
We experimentally demonstrate an all-fiber loading sensor system based on a 45° and an 81° tilted fiber grating (TFG). We fabricated the two TFGs adjacent to each other in a single fiber to form a hybrid structure. When a transverse load is applied to the 81° TFG, the power coupled into the two orthogonally polarized modes is exchanged according to the load on the fiber, which provides a means of measuring the load. For real applications, we further investigated interrogating this all-fiber loading sensor system with a low-cost, compact single-wavelength source and a power meter. The experimental results clearly show that a low-cost, high-sensitivity loading sensor system can be developed based on the proposed TFG configuration.
Abstract:
In this talk we review some of the key enabling technologies of optical communications and potential future bottlenecks. Single-mode fibre (SMF) has long been the preferred waveguide for long-distance communication, largely due to its low loss, low cost and relative linearity over a wide bandwidth. As capacity demands have grown, SMF has largely been able to keep pace. However, several groups have identified the possibility of exhausting the bandwidth provided by SMF [1,2,3]. This so-called "capacity crunch" has potentially vast economic and social consequences and will be discussed in detail. As demand grows, the optical power launched into the fibre can cause nonlinearities that are detrimental to transmission. Considerable work has been done on identifying this nonlinear limit [4,5], with strong research interest currently in nonlinear compensation [6,7]. Embracing and compensating for nonlinear transmission is one potential solution that may extend the lifetime of the current waveguide technology. However, at sufficiently high powers the waveguide will fail due to heat-induced mechanical failure. Moving forward, it becomes necessary to address the waveguide itself, and several promising contenders are discussed, including few-mode fibre and multi-core fibre.
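The nonlinear limit can be illustrated with a toy Gaussian-noise-style model, in which nonlinear interference grows with the cube of launch power so the effective SNR peaks at a finite power (the noise and nonlinearity coefficients below are arbitrary placeholders chosen for illustration, not fitted values):

```python
def snr_gn(p_mw, p_ase_mw=0.01, eta=0.001):
    """Effective SNR with linear ASE noise plus a cubic nonlinear
    interference term: SNR = P / (P_ase + eta * P^3)."""
    return p_mw / (p_ase_mw + eta * p_mw**3)

# Sweep launch power: SNR first rises, then falls as the cubic term
# dominates -- the origin of the nonlinear transmission limit.
powers = [0.1 * i for i in range(1, 101)]
best = max(powers, key=snr_gn)

# Setting d(SNR)/dP = 0 gives the optimum analytically: (P_ase / (2*eta))^(1/3)
analytic = (0.01 / (2 * 0.001)) ** (1 / 3)
print(best, analytic)
```

Past this optimum, launching more power reduces the achievable rate, which is why research effort turns to nonlinear compensation and, ultimately, to new waveguides.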
Abstract:
Desalination is a costly means of providing freshwater. Most desalination plants use either reverse osmosis (RO) or thermal distillation. Both processes have drawbacks: RO is efficient but uses expensive electrical energy; thermal distillation is inefficient but uses less expensive thermal energy. This work aims to provide an efficient RO plant that uses thermal energy. A steam Rankine cycle has been designed to mechanically drive a batch-RO system that achieves high recovery without the high energy penalty typically incurred in a continuous-RO system. The steam may be generated by solar panels, biomass boilers, or as an industrial by-product. A novel mechanical arrangement has been designed for low cost, and a steam-jacketed arrangement for isothermal expansion and improved thermodynamic efficiency. Based on detailed heat transfer and cost calculations, a gain output ratio of 69-162 is predicted, enabling water to be treated at a cost of 71 Indian Rupees/m3 at small scale. Costs will fall with scale-up. Plants may be designed for a wide range of outputs, from 5 m3/day up to commercial versions producing 300 m3/day of clean water from brackish groundwater.
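The gain output ratio (GOR) quoted above is conventionally the latent-heat equivalent of the product water divided by the thermal energy consumed; a minimal sketch (the permeate and heat figures are hypothetical, and the latent heat is a representative value, not one from the paper):

```python
def gain_output_ratio(permeate_kg, heat_input_kj, latent_heat_kj_per_kg=2326.0):
    """GOR: latent-heat equivalent of the product water divided by the
    thermal energy consumed. A GOR near 1 corresponds to single-effect
    distillation; the abstract predicts 69-162 for thermally driven
    batch RO."""
    return permeate_kg * latent_heat_kj_per_kg / heat_input_kj

# Hypothetical figures: 1000 kg of permeate from 23,260 kJ of steam heat.
print(gain_output_ratio(1000.0, 23260.0))  # -> 100.0
```

The high predicted GOR reflects that RO does not evaporate the water at all; the steam only supplies mechanical work to pressurise it.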
Abstract:
Random fiber lasers blend attractive features of traditional random lasers, such as low cost and simplicity of fabrication, with the high-performance characteristics of conventional fiber lasers, such as good directionality and high efficiency. The low coherence of random lasers is important for speckle-free imaging applications. The random fiber laser with distributed feedback proposed in 2010 gave rise to a quickly developing class of light sources that exploit inherent optical-fiber disorder in the form of Rayleigh scattering and distributed Raman gain. The random fiber laser is an interesting and practically important example of a photonic device based on the exploitation of disorder in an optical medium. We provide an overview of recent advances in this field, including high-power and high-efficiency generation, the spectral and statistical properties of random fiber lasers, the nonlinear kinetic theory of such systems, and emerging applications in telecommunications and distributed sensing.
Abstract:
People depend on various sources of information when trying to verify their autobiographical memories. Yet recent research shows that people prefer cheap-and-easy verification strategies, even when these strategies are not reliable. We examined the robustness of this cheap-strategy bias with scenarios designed to encourage greater emphasis on source reliability. In three experiments, subjects described real (Experiments 1 and 2) or hypothetical (Experiment 3) autobiographical events, and proposed strategies they might use to verify their memories of those events. Subjects also rated the reliability and cost of each strategy, and the likelihood that they would use it. In line with previous work, we found that the preference for cheap information held when people described how they would verify childhood or recent memories (Experiment 1), personally important or trivial memories (Experiment 2), and even when the consequences of relying on incorrect information could be significant (Experiment 3). Taken together, our findings fit with an account of source monitoring in which the tendency to trust one's own autobiographical memories can discourage people from systematically testing or accepting strong disconfirmatory evidence.