750 results for Overhead squat


Relevance:

10.00%

Publisher:

Abstract:

Many planning and control tools, especially for network analysis, have been developed over the last four decades. The majority were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e., CPM/PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of CPM to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work presents an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines parts of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, along with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model; it is designed to run on any microcomputer under the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, conclusions are drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of the thesis.
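The flavour of a CYCLONE-style cyclic-operation model is easy to sketch. The toy discrete-event simulation below is our own illustration, not the thesis's package: a single loader serves a truck fleet, and overhead cost accrues with elapsed time alongside operating cost. Node names, durations and cost rates are assumptions.

```python
# Toy CYCLONE-style simulation: one loader (resource) serving a truck
# fleet in a load-haul-return cycle, with time and cost evaluation.
# All parameters are illustrative assumptions, not data from the thesis.
import heapq
import random

def simulate(n_trucks=3, n_loads=50, load_time=(4.0, 8.0),
             cycle_time=(10.0, 15.0), overhead_rate=2.0,
             truck_rate=1.5, seed=1):
    random.seed(seed)
    events = [(0.0, i, "queue") for i in range(n_trucks)]  # (time, truck, node)
    heapq.heapify(events)
    loader_free_at = 0.0
    loads_done = 0
    clock = 0.0
    while loads_done < n_loads:
        clock, truck, node = heapq.heappop(events)
        if node == "queue":                       # truck waits for the loader
            start = max(clock, loader_free_at)
            finish = start + random.uniform(*load_time)
            loader_free_at = finish               # loader is busy until then
            heapq.heappush(events, (finish, truck, "haul"))
        else:                                     # haul, dump and return
            loads_done += 1
            heapq.heappush(events, (clock + random.uniform(*cycle_time),
                                    truck, "queue"))
    operating_cost = clock * n_trucks * truck_rate
    overhead_cost = clock * overhead_rate         # overhead accrues with time
    return clock, operating_cost + overhead_cost

duration, cost = simulate()
print(f"project duration: {duration:.1f} h, total cost: {cost:.1f} units")
```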

Relevance:

10.00%

Publisher:

Abstract:

Transmission of a 73.7 Tb/s (96 x 3 x 256-Gb/s) DP-16QAM mode-division-multiplexed signal over 119 km of few-mode fiber transmission line, incorporating an inline multi-mode EDFA and a phase-plate-based mode (de)multiplexer, is demonstrated. Data-aided 6x6 MIMO digital signal processing was used to demodulate the signal. The total demonstrated net capacity, taking into account 20% FEC overhead and 7.5% additional overhead (Ethernet and training sequences), is 57.6 Tb/s, corresponding to a spectral efficiency of 12 bits/s/Hz.
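The headline figures can be reconciled with back-of-envelope arithmetic. In the sketch below, the combined 28% overhead factor (relative to payload) and the 50 GHz channel grid are our inferred assumptions; the abstract states only the component overheads and the resulting totals.

```python
# Back-of-envelope check of the reported capacity figures. The combined
# 28% overhead (relative to payload) and the 50 GHz channel grid are our
# assumptions; the abstract gives only the component overheads and totals.
gross_per_carrier = 256e9                     # b/s per wavelength and mode
channels, modes = 96, 3
gross = channels * modes * gross_per_carrier  # 73.7 Tb/s line rate
net_per_carrier = gross_per_carrier / 1.28    # 20% FEC + 7.5% framing overhead
net = channels * modes * net_per_carrier      # 57.6 Tb/s net capacity
bandwidth = channels * 50e9                   # assumed 50 GHz spacing
print(f"gross: {gross / 1e12:.1f} Tb/s")
print(f"net:   {net / 1e12:.1f} Tb/s")
print(f"spectral efficiency: {net / bandwidth:.0f} bits/s/Hz")
```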

Relevance:

10.00%

Publisher:

Abstract:

The concept of soft state (i.e., state that expires unless refreshed) has been widely used in the design of network signaling protocols. Approaches to refreshing state in multi-hop networks can be classified into end-to-end (E2E) and hop-by-hop (HbH) refreshes. In this article we propose an effective Markov-chain-based analytical model for both E2E and HbH refresh approaches. Simulations verify the analytical models, which can be used to study the impact of link characteristics on performance (e.g., state synchronization and message overhead) and to guide the configuration and optimization of soft-state signaling protocols. © 2009 IEEE.
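A minimal steady-state version of such a model can be written down directly. The sketch below is our simplification, assuming i.i.d. per-link loss probability p and state that expires after k consecutive missed refreshes; the paper's Markov chain model is more detailed.

```python
# Simplified steady-state comparison of E2E and HbH soft-state refresh.
# Assumptions (ours, not the paper's): i.i.d. per-link loss probability p,
# and state that expires after k consecutive missed refreshes.

def e2e_inconsistency(p, hops, k):
    p_lost = 1 - (1 - p) ** hops   # an E2E refresh is lost on any of the hops
    return p_lost ** k             # k consecutive losses expire the state

def hbh_inconsistency(p, hops, k):
    hop_ok = 1 - p ** k            # each hop refreshes its neighbour locally
    return 1 - hop_ok ** hops      # one expired hop desynchronizes the path

for p in (0.01, 0.05, 0.10):
    print(f"p={p:.2f}  E2E: {e2e_inconsistency(p, 5, 3):.2e}  "
          f"HbH: {hbh_inconsistency(p, 5, 3):.2e}")
```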

Relevance:

10.00%

Publisher:

Abstract:

This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architecture: pull architectures and push architectures. Little data exists on the performance of CDC architectures in a real-time environment, yet such data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results of our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
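To give a flavour of the TAAR idea, here is a minimal sketch of a RESTful data access service that executes an OLTP write and publishes the captured change on the commit path. The framework choice (Flask), endpoint, schema and queue are illustrative assumptions; the thesis's actual DAS design will differ.

```python
# Minimal sketch of "Transactions As A Resource": an HTTP POST executes one
# OLTP transaction, and the change record is pushed downstream as part of
# the same request. Endpoint, schema and queue are illustrative assumptions.
import queue
import sqlite3

from flask import Flask, jsonify, request

app = Flask(__name__)
changes = queue.Queue()   # stand-in for the warehouse-bound CDC channel
db = sqlite3.connect("oltp.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, qty INTEGER)")

@app.post("/transactions/orders")
def create_order():
    payload = request.get_json()
    with db:  # one atomic OLTP transaction
        cur = db.execute("INSERT INTO orders (qty) VALUES (?)", (payload["qty"],))
    change = {"table": "orders", "op": "insert", "key": cur.lastrowid,
              "row": {"qty": payload["qty"]}}
    changes.put(change)  # push CDC: capture happens on the commit path
    return jsonify(change), 201

if __name__ == "__main__":
    app.run()
```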

Relevance:

10.00%

Publisher:

Abstract:

The management and sharing of complex data, information and knowledge is a fundamental and growing concern in the water and other industries for a variety of reasons. For example, risks and uncertainties associated with climate and other changes require knowledge in order to prepare for a range of future scenarios and potential extreme events. Formal ways of establishing and managing knowledge can help deliver efficiencies in acquisition, structuring and filtering, so that only the essential aspects of the knowledge are retained. Ontologies are a key technology for this knowledge management. The construction of ontologies is a considerable overhead on any knowledge management programme, hence current computer science research is investigating the automatic generation of ontologies from documents using text mining and natural language techniques. As an example of this, results from applying the Text2Onto tool to stakeholder documents for a project on sustainable water cycle management in new developments are presented. It is concluded that by adopting ontological representations sooner, rather than later, in an analytical process, decision makers will be able to make better use of knowledge-rich systems containing automated services to ensure that sustainability considerations are included.
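Text2Onto itself is a Java framework, but the first step of the underlying idea, mining candidate concepts from documents, can be sketched in a few lines. The tokenizer, stop-list, threshold and input filename below are our own illustrative choices, not Text2Onto's algorithms.

```python
# Sketch of the first step of ontology learning from text: extract candidate
# concepts by term frequency. Tokenizer, stop-list and threshold are
# illustrative; Text2Onto's actual pipeline is considerably richer.
import re
from collections import Counter

STOPWORDS = {"the", "and", "for", "that", "with", "from", "this", "are", "will"}

def candidate_concepts(text, min_count=3):
    words = re.findall(r"[a-z]+", text.lower())
    terms = (w for w in words if len(w) > 3 and w not in STOPWORDS)
    return [(t, n) for t, n in Counter(terms).most_common() if n >= min_count]

# 'stakeholder_report.txt' is a hypothetical input document
with open("stakeholder_report.txt") as f:
    for term, count in candidate_concepts(f.read())[:10]:
        print(f"{term}: {count}")
```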

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we propose to increase residual carrier frequency offset tolerance based on short perfect-reconstruction pulse shaping for coherent optical orthogonal frequency-division multiplexing (CO-OFDM). The proposed method suppresses the residual carrier frequency offset induced penalty at the receiver, without requiring any additional overhead or exhaustive signal processing. The Q-factor improvement contributed by the proposed method is 1.6 dB and 1.8 dB for time-frequency localization maximization and out-of-band energy minimization pulse shapes, respectively. Finally, the transmission span gain under the influence of residual carrier frequency offset is ~62% with the out-of-band energy minimization pulse shape. © 2014 Optical Society of America.
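The penalty being suppressed here is the inter-carrier interference (ICI) that a residual frequency offset produces. The toy simulation below is our own illustration of that penalty for plain rectangular-pulse OFDM; it does not reproduce the paper's pulse-shaping remedy, and it ignores symbol-to-symbol phase accumulation.

```python
# Illustration of residual-CFO-induced ICI in OFDM with rectangular pulses.
# A per-symbol frequency offset (as a fraction of the subcarrier spacing)
# leaks energy between subcarriers; parameters are illustrative.
import numpy as np

n_sc, n_sym = 64, 200
rng = np.random.default_rng(0)
tx = (rng.choice([-1.0, 1.0], (n_sym, n_sc))
      + 1j * rng.choice([-1.0, 1.0], (n_sym, n_sc))) / np.sqrt(2)

def evm_with_cfo(eps):
    time_sig = np.fft.ifft(tx, axis=1)
    n = np.arange(n_sc)
    rx = np.fft.fft(time_sig * np.exp(2j * np.pi * eps * n / n_sc), axis=1)
    rx *= np.exp(-1j * np.angle(np.mean(rx * tx.conj())))  # strip common phase
    return np.sqrt(np.mean(np.abs(rx - tx) ** 2))

for eps in (0.0, 0.02, 0.05, 0.10):
    print(f"residual CFO {eps:.2f} of subcarrier spacing -> "
          f"EVM {evm_with_cfo(eps):.3f}")
```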

Relevance:

10.00%

Publisher:

Abstract:

We demonstrate a novel phase noise estimation scheme for CO-OFDM in which pilot subcarriers are deliberately correlated with the data subcarriers. This technique reduces the pilot overhead by a factor of 2. © OSA 2014.
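For context, the conventional pilot-aided common-phase-error correction whose overhead the proposed scheme halves looks like the sketch below. Pilot spacing and the phase value are illustrative, and the paper's pilot-data correlation trick is not reproduced.

```python
# Conventional pilot-aided common-phase-error (CPE) correction in CO-OFDM.
# The paper's contribution, correlating pilots with data subcarriers to
# halve this pilot overhead, is not reproduced in this sketch.
import numpy as np

rng = np.random.default_rng(3)
n_sc = 64
pilots = np.arange(0, n_sc, 8)           # 8 known pilots = 12.5% overhead
tx = (rng.choice([-1.0, 1.0], n_sc)
      + 1j * rng.choice([-1.0, 1.0], n_sc)) / np.sqrt(2)

rx = tx * np.exp(1j * 0.3)               # laser phase noise: common rotation

cpe = np.angle(np.sum(rx[pilots] * tx[pilots].conj()))  # estimate from pilots
corrected = rx * np.exp(-1j * cpe)
print(f"estimated CPE: {cpe:.3f} rad")
print(f"residual error: {np.max(np.abs(corrected - tx)):.2e}")
```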

Relevance:

10.00%

Publisher:

Abstract:

We experimentally demonstrate a novel fibre nonlinearity compensation technique for CO-OFDM based on phase-conjugated pilots (PCPs), showing that, by varying the PCP overhead, a performance improvement of up to 4 dB can be achieved, allowing highly flexible adaptation to link characteristics.
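The intuition behind phase conjugation can be seen in a toy model: to first order, a symbol and its phase-conjugated copy accumulate the same nonlinear phase rotation, so combining one with the conjugate of the other cancels the rotation. The model below is our heavy simplification; the actual PCP scheme estimates the distortion from pilot pairs and applies the correction to data subcarriers.

```python
# Toy first-order model of phase-conjugate-based nonlinearity cancellation.
# We model nonlinear phase noise as a random per-symbol rotation theta; a
# symbol and its conjugate copy see the same theta, and combining them
# leaves x * cos(theta), cancelling the phase distortion to first order.
import numpy as np

rng = np.random.default_rng(7)
x = (rng.choice([-1.0, 1.0], 1000)
     + 1j * rng.choice([-1.0, 1.0], 1000)) / np.sqrt(2)
theta = 0.2 + 0.05 * rng.standard_normal(1000)  # nonlinear phase per symbol

y = x * np.exp(1j * theta)               # distorted signal
y_conj = x.conj() * np.exp(1j * theta)   # distorted phase-conjugated copy
combined = (y + y_conj.conj()) / 2       # equals x * cos(theta)

print("EVM before:", np.sqrt(np.mean(np.abs(y - x) ** 2)).round(4))
print("EVM after: ", np.sqrt(np.mean(np.abs(combined - x) ** 2)).round(4))
```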

Relevance:

10.00%

Publisher:

Abstract:

Smart cameras allow video data to be pre-processed on the camera instead of sent to a remote server for further analysis, and a network of smart cameras allows various vision tasks to be processed in a distributed fashion. While cameras may have different tasks, we concentrate on distributed tracking in smart camera networks. This application introduces several highly interesting problems. Firstly, how can conflicting goals be satisfied, such as tracking objects across the network while keeping communication overhead low? Secondly, how can cameras in the network self-adapt in response to the behaviour of objects and changes in scenarios, to ensure continued efficient performance? Thirdly, how can cameras organise themselves to improve the overall network's performance and efficiency? This paper presents a simulation environment, called CamSim, that allows distributed self-adaptation and self-organisation algorithms to be tested without setting up a physical smart camera network. The simulation tool is written in Java and is hence highly portable between operating systems. By abstracting away various problems of computer vision and network communication, it enables a focus on implementing and testing new self-adaptation and self-organisation algorithms for cameras to use.
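As an example of the kind of algorithm CamSim is designed to exercise, the sketch below implements a toy auction-style handover: each camera offers its tracked objects to the neighbour with the highest tracking confidence, and every bid costs one message, exposing the tracking-versus-communication tradeoff. The confidence model and auction rule are our own illustrative assumptions.

```python
# Toy auction-based handover of tracked objects between smart cameras,
# counting messages to expose the tracking-vs-communication tradeoff.
# Confidence model and auction rule are illustrative assumptions.
import random

class Camera:
    def __init__(self, cam_id):
        self.cam_id = cam_id
        self.objects = set()

    def confidence(self, obj):
        # deterministic stand-in for how well this camera sees the object
        return random.Random(hash((self.cam_id, obj))).random()

def auction_step(cameras):
    messages = 0
    for cam in cameras:
        for obj in list(cam.objects):
            bids = {c: c.confidence(obj) for c in cameras if c is not cam}
            messages += len(bids)                  # one message per bid
            best = max(bids, key=bids.get)
            if bids[best] > cam.confidence(obj):   # hand over to a better view
                cam.objects.discard(obj)
                best.objects.add(obj)
    return messages

cams = [Camera(i) for i in range(4)]
cams[0].objects = {"person-1", "car-2"}
print("messages this step:", auction_step(cams))
print([(c.cam_id, sorted(c.objects)) for c in cams])
```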


Relevance:

10.00%

Publisher:

Abstract:

Dynamic asset rating (DAR) is one of a number of techniques that could be used to facilitate low-carbon electricity network operation. Previous work has looked at this technique from an asset perspective. This paper instead takes a network perspective, proposing a dynamic network rating (DNR) approach. The models available for use with DAR are discussed and compared using measured load and weather data from a trial network area within Milton Keynes in the central area of the U.K. The paper then uses the most appropriate model to investigate, through a network case study, the potential gains in dynamic rating compared to static rating for the different network assets: transformers, overhead lines, and cables. This will inform the network operator of the potential DNR gains on an 11-kV network with all assets present and highlight the limiting assets within each season.
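The physical basis for dynamic-rating gains is the conductor heat balance: permitted current rises as ambient temperature falls and wind speed increases. The sketch below is a heavily simplified, illustrative heat-balance calculation in the spirit of IEEE Std 738; the coefficients are ours, not from the paper.

```python
# Heavily simplified steady-state heat balance for an overhead line, to show
# why ratings are dynamic: the ampacity is the current at which I^2*R heating
# equals convective plus radiative cooling. Coefficients are illustrative.
import math

def ampacity(t_ambient, wind_speed, t_max=75.0, r_ac=1.1e-4,
             diameter=0.02, emissivity=0.5):
    dt = t_max - t_ambient
    q_conv = 8.0 * math.sqrt(wind_speed + 0.1) * dt        # convective, W/m
    q_rad = 17.8e-8 * emissivity * diameter * (
        (t_max + 273.0) ** 4 - (t_ambient + 273.0) ** 4)   # radiative, W/m
    return math.sqrt((q_conv + q_rad) / r_ac)              # I where I^2*R = q

static = ampacity(t_ambient=30.0, wind_speed=0.5)    # conservative static case
dynamic = ampacity(t_ambient=10.0, wind_speed=5.0)   # measured cool, windy case
print(f"static rating:  {static:.0f} A")
print(f"dynamic rating: {dynamic:.0f} A "
      f"({100 * (dynamic / static - 1):.0f}% headroom)")
```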

Relevance:

10.00%

Publisher:

Abstract:

The relatively high phase noise of coherent optical systems poses unique challenges for forward error correction (FEC). In this letter, we propose a novel semianalytical method for selecting combinations of interleaver lengths and binary Bose-Chaudhuri-Hocquenghem (BCH) codes that meet a target post-FEC bit error rate (BER). Our method requires only short pre-FEC simulations, based on which we design interleavers and codes analytically. It is applicable to pre-FEC BER ~10^-3 and any post-FEC BER. In addition, we show that there is a tradeoff between code overhead and interleaver delay. Finally, for a target of 10^-5, numerical simulations show that interleaver-code combinations selected using our method have post-FEC BER around 2x the target. The target BER is achieved with 0.1 dB extra signal-to-noise ratio.
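The role of the interleaver is to make the bit errors entering the BCH decoder look independent; under that idealization, the post-FEC BER of a t-error-correcting code follows from the binomial tail, as in the sketch below (code parameters illustrative). The paper's semianalytical method is what replaces the independence assumption when phase noise correlates the errors.

```python
# Post-FEC BER of a t-error-correcting binary BCH(n, k) code assuming the
# interleaver makes decoder-input bit errors i.i.d. at pre-FEC BER p.
# Decoding fails when more than t of n bits are in error, leaving ~i errors.
from math import comb

def post_fec_ber(n, t, p):
    q = 1.0 - p
    return sum(i * (comb(n, i) * p**i * q ** (n - i))
               for i in range(t + 1, n + 1)) / n

# BCH(1023, 883) corrects t = 14 errors (illustrative code choice)
for p in (1e-3, 2e-3, 3e-3):
    print(f"pre-FEC BER {p:.0e} -> post-FEC BER {post_fec_ber(1023, 14, p):.1e}")
```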

Relevance:

10.00%

Publisher:

Abstract:

A new method is suggested for solving some hard combinatorial optimization problems that admit a certain reformulation. Given such a problem, several different but similar problems with the same set of solutions are prepared. They are solved on computer in parallel until one of them is solved, and that solution is accepted. Notwithstanding the evident overhead, the whole run time can be significantly reduced because the speed of combinatorial search varies widely across the prepared instances. The efficiency of this approach is investigated on the concrete problem of finding short solutions of a non-deterministic system of linear logical equations.
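This is the portfolio pattern, and it is easy to sketch. Below, randomised restarts stand in for the paper's reformulated problem instances, and a dummy search task stands in for the logical-equation solver; both substitutions are ours.

```python
# Sketch of the portfolio strategy: launch several equivalent problem
# variants in parallel and accept the first solution found. Randomised
# restarts stand in for the paper's reformulated problem instances.
import random
from concurrent.futures import FIRST_COMPLETED, ProcessPoolExecutor, wait

def solve_variant(seed):
    """Randomised search whose running time disperses strongly with the seed."""
    rng = random.Random(seed)
    tries = 0
    while True:
        tries += 1
        if rng.randrange(1_000_000) == 0:   # stand-in for finding a solution
            return seed, tries

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(solve_variant, seed) for seed in range(8)]
        done, pending = wait(futures, return_when=FIRST_COMPLETED)
        for f in pending:
            f.cancel()   # abandon not-yet-started variants; running ones
                         # are allowed to finish before the pool shuts down
        seed, tries = next(iter(done)).result()
        print(f"variant {seed} finished first after {tries:,} tries")
```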

Relevance:

10.00%

Publisher:

Abstract:

I was recently at the Spanish College of Optometry biennial conference and attended a meeting of contact lens lecturers from around Spain and Portugal. We discussed various ideas, mainly about how to share good practice and improve standards. What came to my mind was: is there a blueprint for training trainers? Probably not, but there are many things we need to acknowledge, such as the way students learn. Many educators were themselves taught by lecturers who would write on a blackboard or use acetate on an overhead projector; then came the 35 mm slide era, followed by the PowerPoint era. More recently there has been a move towards a much more integrated approach combining various teaching methods.

At my university, our contact lens and anterior eye lectures generally follow a format in which a narrated PowerPoint lecture is uploaded to our internal virtual learning environment. This narrated version of the slides is designed to deliver the didactic element of the topic. The students listen to it before attending an interactive seminar on that topic, and the seminar is also recorded so that students can listen to it afterwards. The seminar is designed to give additional information, such as case reports, to clarify key points, or for live demonstrations. It is a good way of doubling the contact time with the students without imposing further on an already packed formal timetable, as the students can work in their own time. One problem we noticed with this approach was that attendance can vary: if the students feel that they will gain something from the interactive seminar, they are more likely to attend. Exam tips usually win them over!

At the Spanish meeting the educators decided that they wanted to have regular meetings. The industry colleagues in attendance said that, while they could not necessarily give money, they were happy to help by offering meeting rooms and paying for lunch and evening meals. They even said they were happy to host meetings and invite other companies too (except to manufacturing plants). In the UK, the British Committee of Contact Lens Educators (BUCCLE) meets for one day on three occasions a year. The American Optometric Contact Lens Educators (AOCLE) meets annually at a three-day event. Both organisations get some help from industry. BUCCLE usually holds one of its meetings at a university, one at a company training centre, manufacturing plant or national headquarters, and one on the day before the BCLA annual conference; the pre-BCLA meeting is usually held in conjunction with the International Association of Contact Lens Educators (IACLE).

So when educators meet, what should they discuss? Probably the focus should be on education rather than contact lens knowledge itself. For example, sharing ideas on how to teach toric lens fitting would be better than discussing the topic of toric lenses. Most universities have an education department with experts who can share ideas on how to use the internet in teaching, or how to structure lectures and assessments. In the past I have helped with similar training programmes in other countries, and sharing good practice in pedagogy is always a popular topic. Anyone involved in education in the field of contact lenses should look at the IACLE web page and look out for the IACLE World Congress in 2015, in the days preceding the BCLA. Finally, IACLE, AOCLE and BUCCLE all exist as a result of generous educational grants from contact lens companies, and anyone interested in finding out more should refer to their respective web pages.

Relevance:

10.00%

Publisher:

Abstract:

Location estimation is important for wireless sensor network (WSN) applications. In this paper we propose a Cramer-Rao bound (CRB) based analytical approach for two centralized multi-hop localization algorithms, to gain insight into the error performance and its sensitivity to the distance measurement error, anchor node density and anchor placement. The location estimation performance is compared by simulation with four distributed multi-hop localization algorithms to evaluate the efficiency of the proposed analytical approach. The numerical results demonstrate the complex tradeoff between centralized and distributed localization algorithms in terms of accuracy, complexity and communication overhead. Based on this analysis, an efficient and scalable performance evaluation tool can be designed for localization algorithms in large-scale WSNs, where simulation-based evaluation approaches are impractical. © 2013 IEEE.
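For the single-hop, range-based case the CRB has a simple closed form: the Fisher information matrix is a sum of rank-one terms along the node-to-anchor unit vectors, which makes the bound's sensitivity to anchor placement explicit. The sketch below computes it with an illustrative anchor layout and noise level; the paper's multi-hop analysis builds on bounds of this kind.

```python
# CRB for range-based 2-D localization with Gaussian ranging noise: the
# Fisher information is a sum of rank-one terms along the unit vectors to
# the anchors. Anchor layout and noise level are illustrative.
import numpy as np

def position_crb(node, anchors, sigma):
    """Lower bound on RMS position error, in the coordinate units."""
    fim = np.zeros((2, 2))
    for a in anchors:
        u = (node - a) / np.linalg.norm(node - a)   # unit vector anchor->node
        fim += np.outer(u, u) / sigma**2
    return np.sqrt(np.trace(np.linalg.inv(fim)))

node = np.array([40.0, 60.0])
anchors = [np.array(p, dtype=float)
           for p in [(0, 0), (100, 0), (0, 100), (100, 100)]]
print(f"CRB on RMS position error: {position_crb(node, anchors, sigma=2.0):.2f} m")
```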