32 results for Effective teaching -- Computer network resources

in Aston University Research Archive


Relevance: 100.00%

Publisher:

Abstract:

An investigation is carried out into the design of a small local computer network for eventual implementation on the University of Aston campus. Microprocessors are investigated as a possible choice for use as a node controller for reasons of cost and reliability. Since the network will be local, high-speed lines of megabit order are proposed. After an introduction to several well-known networks, various aspects of networks are discussed, including packet switching, functions of a node and host-node protocol. Chapter three develops the network philosophy with an introduction to microprocessors. Various organisations of microprocessors into multicomputer and multiprocessor systems are discussed, together with methods of achieving reliable computing. Chapter four presents the simulation model and its implementation as a computer program. The major modelling effort is to study the behaviour of messages queueing for access to the network and the message delay experienced on the network. Use is made of spectral analysis to determine the sampling frequency, while Exponentially Weighted Moving Averages are used for data smoothing.
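The abstract mentions Exponentially Weighted Moving Averages for smoothing the simulated delay data; a minimal sketch of that smoothing step (the `alpha` value is an illustrative assumption, not taken from the thesis):

```python
def ewma(samples, alpha=0.2):
    """Exponentially Weighted Moving Average: each output blends the new
    sample with the previous smoothed value, weighted by alpha."""
    smoothed = []
    s = samples[0]  # seed with the first observation
    for x in samples:
        s = alpha * x + (1 - alpha) * s
        smoothed.append(s)
    return smoothed
```

Larger `alpha` values track the raw data more closely; smaller values suppress more of the simulation noise.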

Relevance: 100.00%

Publisher:

Abstract:

Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research work concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigations using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems. Fixed, dynamic and hybrid channel assignments are considered. The formulas agree with previously published simulation results. Simulation programs are described for the evaluation of the speech traffic of mobiles and for the investigation of a possible computer network for the control of the speech traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels performing various control functions which include: the setting-up and clearing-down of calls, the hand-over of calls between cells and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers inter-connected via voice-grade data channels would be capable of providing satisfactory control.
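The congestion formulas described build on the familiar Erlang result; as background, the classical Erlang B blocking probability can be computed with the standard recursion (this is the unmodified textbook formula, not the modified forms derived in the thesis):

```python
def erlang_b(traffic_erlangs, channels):
    """Blocking probability for offered traffic (in erlangs) on a given
    number of channels, via the numerically stable Erlang B recursion:
    B(0) = 1;  B(n) = A*B(n-1) / (n + A*B(n-1))."""
    b = 1.0
    for n in range(1, channels + 1):
        b = (traffic_erlangs * b) / (n + traffic_erlangs * b)
    return b
```

For example, 2 erlangs offered to a single channel gives a blocking probability of 2/3; adding channels drives the probability down.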

Relevance: 100.00%

Publisher:

Abstract:

This research concerns the development of coordination and co-governance within three different regeneration programmes within one Midlands city over the period from 1999 to 2002. The New Labour government, in office since 1997, had an agenda for ‘joining-up’ government, part of which has had considerable impact in the area of regeneration policy. Joining-up government encompasses a set of related activities which can include the coordination of policy-making and service delivery. In regeneration, it also includes a commitment to operate through co-governance. Central government and local and regional organisations have sought to put this idea into practice by using what may be referred to as network management processes. Many characteristics of new policies are designed to address the management of networks. Network management is not new in this area; it has developed at least since the early 1990s with the City Challenge and Single Regeneration Budget (SRB) programmes as a way of encouraging more inclusive and effective regeneration interventions. Network management theory suggests that better management can improve decision-making outcomes in complex networks. The theories and concepts are utilised in three case studies as a way of understanding how and why regeneration attempts demonstrate real advances in inter-organisational working at certain times whilst faltering at others. Current cases are compared to the historical case of the original SRB programme as a method of assessing change. The findings suggest that: The use of network management can be identified at all levels of governance. As previous literature has highlighted, central government is the most important actor regarding network structuring.
However, it can be argued that network structuring and game management are both practised by central and local actors; Furthermore, all three of the theoretical perspectives within network management (Instrumental, Institutional and Interactive), have been identified within UK regeneration networks. All may have a role to play with no single perspective likely to succeed on its own. Therefore, all could make an important contribution to the understanding of how groups can be brought together to work jointly; The findings support Klijn’s (1997) assertion that the institutional perspective is dominant for understanding network management processes; Instrumentalism continues on all sides, as the acquisition of resources remains the major driver for partnership activity; The level of interaction appears to be low despite the intentions for interactive decision-making; Overall, network management remains partial. Little attention is paid to the issues of accountability or to the institutional structures which can prevent networks from implementing the policies designed by central government, and/or the regional tier.

Relevance: 100.00%

Publisher:

Abstract:

Initially the study focussed on the factors affecting the ability of the police to solve crimes. An analysis of over twenty thousand police deployments revealed the proportion of time spent investigating crime contrasted with its perceived importance and the time spent on other activities. The fictional portrayal of skills believed important in successful crime investigation was identified and compared to the professional training and 'taught skills' given to police and detectives. Police practitioners and middle management provided views on the skills needed to solve crimes. The relative importance of the forensic science role, fingerprint examination and interrogation skills was contrasted with changes in police methods resulting from the Police and Criminal Evidence Act and its effect on confessions. The study revealed that existing police systems for investigating crime, excluding cases of murder and other serious offences, were unsystematic, uncoordinated, unsupervised and unproductive in using police resources. The study examined relevant and contemporary research in the United States and United Kingdom and, with organisational support, introduced an experimental system of data capture and initial investigation with features of case screening and management. Preliminary results indicated increases in the collection of essential information and more effective use of investigative resources. Within the managerial framework of this study, research has been undertaken in the knowledge elicitation area as a basis for an expert system of crime investigation and into the potential organisational benefits of utilising the laptop computer in the first stages of data gathering and investigation. The conclusions demonstrate the need for a totally integrated system of criminal investigation with emphasis on an organisational rather than individual response. In some areas the evidence produced is sufficient to warrant replication; in others additional research is needed to further explore other concepts and proposed systems pioneered by this study.

Relevance: 100.00%

Publisher:

Abstract:

Nowadays, road safety and traffic congestion are major concerns worldwide, which is why research on vehicular communication is vital. In static scenarios, vehicles behave much like nodes in an office network, transmitting without moving and with no defined position. This paper analyses the impact of context information on existing popular rate adaptation algorithms. Our simulation was done in MATLAB by observing the impact of context information on these algorithms, for both static and mobile cases. Our simulations are based on the IEEE 802.11p wireless standard. In the static scenario, vehicles do not move and have no defined positions, while in the mobile case, vehicles move with uniformly selected speeds and randomised positions. Network performance is analysed using context information. Our results show that under mobility, when context information is used, system performance can be improved for all three rate adaptation algorithms. This can be explained by range checking: when many vehicles are out of communication range, fewer vehicles contend for network resources, thereby increasing network performance. © 2013 IEEE.
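The range-checking idea can be sketched as a simple filter over vehicle positions: only vehicles still inside the communication range count as contenders for the channel (the function names and the 300 m default range are illustrative assumptions, not from the paper):

```python
import math

def in_range(tx_pos, rx_pos, comm_range):
    """True if the receiver lies within the transmitter's communication range."""
    return math.dist(tx_pos, rx_pos) <= comm_range

def contenders(tx_pos, vehicle_positions, comm_range=300.0):
    """Vehicles that remain in range and therefore still contend for
    network resources; out-of-range vehicles are excluded up front."""
    return [v for v in vehicle_positions if in_range(tx_pos, v, comm_range)]
```

With many vehicles filtered out as out-of-range, the remaining contenders see less contention, which matches the performance gain the abstract reports.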

Relevance: 100.00%

Publisher:

Abstract:

A framework that aims to best utilize the mobile network resources for video applications is presented in this paper. The main contribution of the work proposed is the QoE-driven optimization method that can maintain a desired trade-off between fairness and efficiency in allocating resources, in terms of data rates, to video streaming users in LTE networks. This method is concerned with the control of the user satisfaction level from the service continuity's point of view and applies appropriate QoE metrics (Pause Intensity and variations) to determine the scheduling strategies, in combination with the mechanisms used for adaptive video streaming such as 3GP/MPEG-DASH. The superiority of the proposed algorithms is demonstrated, showing how the resources of a mobile network can be optimally utilized by using quantifiable QoE measurements. This approach can also find the best match between demand and supply in the process of network resource distribution.
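As background on the fairness side of the trade-off (this is the classical weighted proportional-fair split, not the paper's QoE-driven method), capacity can be divided in proportion to per-user weights:

```python
def proportional_fair_shares(capacity_kbps, weights):
    """Split link capacity in proportion to per-user weights; with log
    utilities and a single capacity constraint this weighted split is the
    proportional-fair optimum."""
    total = sum(weights)
    return [capacity_kbps * w / total for w in weights]
```

A QoE-driven scheduler like the one described would instead derive the weights from satisfaction metrics such as Pause Intensity, shifting rate towards users whose playback is at risk of stalling.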

Relevance: 100.00%

Publisher:

Abstract:

In wireless sensor networks where nodes are powered by batteries, it is critical to prolong the network lifetime by minimizing the energy consumption of each node. In this paper, the cooperative multiple-input-multiple-output (MIMO) and data-aggregation techniques are jointly adopted to reduce the energy consumption per bit in wireless sensor networks by reducing the amount of data for transmission and making better use of network resources through cooperative communication. For this purpose, we derive a new energy model that considers the correlation between data generated by nodes and the distance between them for a cluster-based sensor network by employing the combined techniques. Using this model, the effect of the cluster size on the average energy consumption per node can be analyzed. It is shown that the energy efficiency of the network can significantly be enhanced in cooperative MIMO systems with data aggregation, compared with either cooperative MIMO systems without data aggregation or data-aggregation systems without cooperative MIMO, if sensor nodes are properly clustered. Both centralized and distributed data-aggregation schemes for the cooperating nodes to exchange and compress their data are also proposed and appraised, which lead to diverse impacts of data correlation on the energy performance of the integrated cooperative MIMO and data-aggregation systems.
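To illustrate why aggregating correlated data at a cluster head can save energy, here is a sketch using the common first-order radio model (all constants and the linear correlation-compression rule are illustrative assumptions, not the energy model derived in the paper):

```python
E_ELEC = 50e-9   # J/bit in transmit/receive electronics (illustrative)
E_AMP = 100e-12  # J/bit/m^2 amplifier energy (illustrative)

def tx_energy(bits, distance_m):
    """First-order radio model: electronics cost plus a distance-squared
    amplifier cost."""
    return bits * (E_ELEC + E_AMP * distance_m ** 2)

def rx_energy(bits):
    return bits * E_ELEC

def aggregated_cost(bits_per_node, nodes, correlation, hop_m, sink_m):
    """Cluster members send to a head over short hops; the head compresses
    correlated data before one long-haul transmission to the sink."""
    intra = (nodes - 1) * (tx_energy(bits_per_node, hop_m)
                           + rx_energy(bits_per_node))
    # linear compression rule: fully correlated data adds no new bits
    compressed_bits = bits_per_node * (1 + (nodes - 1) * (1 - correlation))
    return intra + tx_energy(compressed_bits, sink_m)
```

Raising the correlation shrinks the long-haul payload, so the total cost falls, mirroring the paper's observation that data correlation shapes the energy performance of the combined scheme.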

Relevance: 100.00%

Publisher:

Abstract:

In this work, we present an adaptive unequal loss protection (ULP) scheme for H.264/AVC video transmission over lossy networks. This scheme combines erasure coding, H.264/AVC error resilience techniques and importance measures in video coding. The unequal importance of the video packets is identified in the group of pictures (GOP) and the H.264/AVC data partitioning levels. The presented method can adaptively assign unequal amounts of forward error correction (FEC) parity across the video packets according to the network conditions, such as the available network bandwidth, packet loss rate and average packet burst loss length. A near-optimal algorithm is developed to deal with the FEC assignment for optimization. The simulation results show that our scheme can effectively utilize network resources such as bandwidth, while improving the quality of the video transmission. In addition, the proposed ULP strategy ensures graceful degradation of the received video quality as the packet loss rate increases. © 2010 IEEE.
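The benefit of assigning more FEC parity to important packets can be illustrated with the decode-failure probability of an erasure-coded block under independent packet loss (a background calculation, not the paper's near-optimal assignment algorithm):

```python
from math import comb

def decode_failure_prob(k, parity, loss_rate):
    """A (k + parity, k) erasure code recovers any `parity` lost packets
    out of n = k + parity; decoding fails when more than `parity` of the
    n packets are lost (binomial tail under independent loss)."""
    n = k + parity
    return sum(comb(n, i) * loss_rate**i * (1 - loss_rate)**(n - i)
               for i in range(parity + 1, n + 1))
```

Blocks carrying more important partitions (e.g. the base layer of a GOP) would be given a larger `parity`, driving their failure probability down at the cost of extra bandwidth.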

Relevance: 100.00%

Publisher:

Abstract:

This work reports the development of a mathematical model and distributed, multivariable computer control for a pilot-plant double-effect climbing-film evaporator. A distributed-parameter model of the plant has been developed and the time-domain model transformed into the Laplace domain. The model has been further transformed into an integral domain conforming to an algebraic ring of polynomials, to eliminate the transcendental terms which arise in the Laplace domain due to the distributed nature of the plant model. This has made possible the application of linear control theories to a set of linear partial differential equations. The models obtained track the experimental results of the plant well. A distributed-computer network has been interfaced with the plant to implement digital controllers in a hierarchical structure. A modern multivariable Wiener-Hopf controller has been applied to the plant model. The application has revealed a limiting condition: that the plant matrix should be positive-definite along the infinite frequency axis. A new multivariable control theory has emerged from this study which avoids the above limitation. The controller has the structure of the modern Wiener-Hopf controller, but with a unique feature enabling a designer to specify the closed-loop poles in advance and to shape the sensitivity matrix as required. In this way, the method treats directly the interaction problems found in chemical processes, with good tracking and regulation performances. The ability of analytical design methods to determine once and for all whether a given set of specifications can be met is one of their chief advantages over conventional trial-and-error design procedures. However, one disadvantage that offsets these enormous advantages to some degree is the relatively complicated algebra that must be employed in working out all but the simplest problems.
Mathematical algorithms and computer software have been developed to treat some of the mathematical operations defined over the integral domain, such as matrix fraction description, spectral factorization, the Bezout identity, and the general manipulation of polynomial matrices. Hence, the design problems of Wiener-Hopf type of controllers and other similar algebraic design methods can be easily solved.
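As a small illustration of the polynomial-matrix manipulation mentioned above, matrix entries can be represented as coefficient arrays (lowest degree first) and multiplied by convolution (a minimal sketch, not the thesis software, which also covers spectral factorization and the Bezout identity):

```python
import numpy as np

def polymat_mul(A, B):
    """Multiply two polynomial matrices whose entries are coefficient
    sequences (lowest degree first): entry (i, j) of the product is
    sum over k of conv(A[i][k], B[k][j])."""
    rows, inner, cols = len(A), len(B), len(B[0])
    out = []
    for i in range(rows):
        row = []
        for j in range(cols):
            acc = np.zeros(1)
            for k in range(inner):
                term = np.convolve(A[i][k], B[k][j])  # polynomial product
                size = max(len(acc), len(term))       # align degrees, then add
                acc = (np.pad(acc, (0, size - len(acc)))
                       + np.pad(term, (0, size - len(term))))
            row.append(acc)
        out.append(row)
    return out
```

For scalar (1x1) matrices this reduces to ordinary polynomial multiplication, e.g. (1 + s)(1 - s) = 1 - s^2.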

Relevance: 100.00%

Publisher:

Abstract:

B-ISDN is a universal network which supports diverse mixes of services, applications and traffic. ATM has been accepted world-wide as the transport technique for future use in B-ISDN. ATM, being a simple packet-oriented transfer technique, provides a flexible means for supporting a continuum of transport rates and is efficient due to possible statistical sharing of network resources by multiple users. In order to fully exploit the potential statistical gain, while at the same time providing diverse service and traffic mixes, an efficient traffic control must be designed. Traffic controls, which include congestion and flow control, are a fundamental necessity to the success and viability of future B-ISDN. Congestion and flow control is difficult in the broadband environment due to the high-speed links, the wide-area distances, diverse service requirements and diverse traffic characteristics. Most congestion and flow control approaches in conventional packet-switched networks are reactive in nature and are not applicable in the B-ISDN environment. In this research, traffic control procedures mainly based on preventive measures for a private ATM-based network are proposed and their performance evaluated. The various traffic controls include CAC, traffic flow enforcement, priority control and an explicit feedback mechanism. These functions operate at call level and cell level. They are carried out distributively by the end terminals, the network access points and the internal elements of the network. During the connection set-up phase, the CAC decides the acceptance or denial of a connection request and allocates bandwidth to the new connection according to three schemes: peak bit rate, statistical rate and average bit rate. The statistical multiplexing rate is based on a 'bufferless fluid flow model' which is simple and robust. The allocation of an average bit rate to data traffic at the expense of delay obviously improves the network bandwidth utilisation.
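The three bandwidth-booking schemes can be sketched as a toy admission check (the statistical rate shown is a simple illustrative midpoint between mean and peak, not the bufferless fluid-flow model the abstract describes):

```python
def effective_rate(peak, mean, scheme):
    """Bandwidth booked for one connection under the three CAC schemes."""
    if scheme == "peak":
        return peak                        # worst case, no statistical gain
    if scheme == "average":
        return mean                        # best utilisation, at a delay cost
    if scheme == "statistical":
        return mean + 0.5 * (peak - mean)  # illustrative intermediate value
    raise ValueError(scheme)

def admit(booked, peak, mean, link_capacity, scheme):
    """Accept a new connection only if the total booked bandwidth,
    including this connection's effective rate, fits the link."""
    return booked + effective_rate(peak, mean, scheme) <= link_capacity
```

The same request can thus be rejected under peak-rate booking but accepted under average-rate booking, which is exactly the utilisation-versus-delay trade-off the abstract notes.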

Relevance: 100.00%

Publisher:

Abstract:

Mathematical methods in systematic conservation planning (SCP) represent a significant step toward cost-effective, transparent allocation of resources for biodiversity conservation. However, research demonstrates important consequences of uncertainties in SCP. Current research often relies on simplified case studies with unknown forms and amounts of uncertainty and low statistical power for generalizing results. Consequently, conservation managers have little evidence for the true performance of conservation planning methods in their own complex, uncertain applications. SCP needs to build evidence for predictive models of error and robustness to multiple, simultaneous uncertainties across a wide range of problems of known complexity. Only then can we determine true performance rather than how a method appears to perform on data with unknown uncertainty.

Relevance: 100.00%

Publisher:

Abstract:

This thesis discusses and assesses the resources available to Asian entrepreneurs in the West Midlands' clothing industry and how they are used by these small businessmen in order to address opportunities in the market economy within the constraints imposed. The fashion industry is volatile and is dependent upon flexible firms which can respond quickly to short-run production schedules. Small firms are best able to respond to this market environment. Production of jeans presents an interesting departure from the mainstream fashion industry. It is traditionally geared towards long-run production schedules, where multinational enterprises have artificially diversified the market, promoting the 'right' brand name, and have established control of the upper end of the market, whilst imports from Newly Developing Countries have catered for cheap copies at the lower end of the market. In recent years, a fashion element to jeans has emerged, thus opening a market gap for U.K. manufacturers to respond in the same way as for other fashion articles. A large immigrant population, previously serving the now declining factories and foundries of the West Midlands but, through redundancy, no longer a part of this employment sector, has responded to economic constraints and market opportunities by drawing on ethnic network resources for competitive access to labour, finance and contacts, to attack the emergent market gap. Two models of these Asian entrepreneurs are developed. The first is someone who has professionally and actively tackled the market gap and become established. These entrepreneurs are usually educated, have personal experience in business and were amongst the first to perceive opportunities to enter the industry, actively utilising their ethnicity as a resource upon which to draw for favourable access to cheap, flexible labour and capital. The second model is composed of later entrants to jeans manufacturing. They have less formal education and experience and have been pushed into self-employment by the constraints of unemployment. Their ethnicity is passively used as a resource. They are more likely to be confined to the marginal activity of 'cut, make and trim' and have little opportunity to increase profit margins, become established or expand.

Relevance: 100.00%

Publisher:

Abstract:

This paper introduces a joint load balancing and hotspot mitigation protocol for mobile ad-hoc networks (MANETs), termed by us the 'load_energy balance + hotspot mitigation protocol (LEB+HM)'. We argue that although ad-hoc wireless networks have limited network resources - bandwidth and power - are prone to frequent link/node failures and carry high security risk, existing ad-hoc routing protocols do not put emphasis on maintaining robust links/nodes, on efficient use of network resources or on maintaining the security of the network. Typical route selection metrics used by existing ad-hoc routing protocols are shortest hop, shortest delay and loop avoidance. These routing philosophies have a tendency to cause traffic concentration on certain regions or nodes, leading to heavy contention, congestion and resource exhaustion, which in turn may result in increased end-to-end delay, packet loss and faster battery power depletion, degrading the overall performance of the network. Also, in most existing on-demand ad-hoc routing protocols, intermediate nodes are allowed to send a route reply (RREP) to the source in response to a route request (RREQ). In such a situation a malicious node can send a false optimal route to the source so that data packets will be directed to or through it, and it can tamper with them at will. It is therefore desirable to adopt routing schemes which can dynamically disperse traffic load, detect and remove any possible bottlenecks, and provide some form of security to the network. In this paper we propose a combined adaptive load_energy balancing and hotspot mitigation scheme that aims at evenly distributing network traffic load and energy, mitigating against any possible occurrence of hotspots and providing some form of security to the network. This combined approach is expected to yield high reliability, availability and robustness that best suit any dynamic and scalable ad-hoc network environment. Dynamic source routing (DSR) was used as the underlying protocol for the implementation of our algorithm. Simulation comparison of our protocol to the original DSR shows that our protocol achieves reduced node/link failure, even distribution of battery energy, and better network service efficiency.
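A load/energy-aware route metric of the kind argued for here can be sketched as follows (the node fields and the cost function are hypothetical illustrations, not the LEB+HM metric itself):

```python
def route_cost(path_nodes):
    """Hypothetical metric: penalise routes through nodes with long packet
    queues (load) or low residual battery energy; a shortest-hop metric
    would ignore both and concentrate traffic on hotspots."""
    return sum(n["queue_len"] / max(n["energy"], 1e-9) for n in path_nodes)

def select_route(candidate_routes):
    """Pick the candidate route that best balances load and energy."""
    return min(candidate_routes, key=route_cost)
```

Because the cost grows with queue length and shrinks with residual energy, traffic is steered away from congested or nearly depleted nodes, dispersing load instead of concentrating it.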

Relevance: 100.00%

Publisher:

Abstract:

Flexible optical networking is identified today as the solution that offers smooth system upgradability towards Tb/s capacities and optimized use of network resources. However, in order to fully exploit the potential of flexible spectrum allocation and networking, the development of a flexible switching node is required, capable of adaptively adding, dropping and switching tributaries with variable bandwidth characteristics from/to ultra-high-capacity wavelength channels at the lowest switching granularity. This paper presents the main concept and technology solutions envisioned by the EU-funded project FOX-C, which targets the design, development and evaluation of the first functional system prototype of flexible add-drop and switching cross-connects. The key developments enable ultra-fine switching granularity at the optical subcarrier level, providing end-to-end routing of any tributary channel with flexible bandwidth down to 10Gb/s (or even lower) carried over wavelength superchannels, each with an aggregated capacity beyond 1Tb/s. © 2014 IEEE.
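Flexible spectrum allocation at fine granularity can be illustrated with a first-fit search for contiguous slots on a flexi-grid link (an illustrative sketch of the general technique, not the FOX-C node design):

```python
def first_fit_slots(occupied, needed, total_slots):
    """Return the first run of `needed` contiguous free spectrum slots on a
    link, or None if no such run exists. `occupied` is a set of slot
    indices already assigned to other channels."""
    run = 0
    for s in range(total_slots):
        run = run + 1 if s not in occupied else 0  # extend or reset the free run
        if run == needed:
            return list(range(s - needed + 1, s + 1))
    return None
```

A tributary needing more bandwidth simply requests a longer run of slots, which is the elasticity that fixed-grid WDM lacks.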

Relevance: 100.00%

Publisher:

Abstract:

Throughput plays a vital role in data transfer in vehicular networks, which is useful for both safety and non-safety applications. An algorithm that adapts to the mobile environment by using context information is proposed in this paper. Since one of the problems of existing rate adaptation algorithms is underutilization of link capacity in vehicular environments, we have demonstrated that in wireless and mobile environments vehicles can adapt to highly mobile link conditions and still perform better: range checking excludes vehicles that are out of communication range, de-congesting the network and improving system performance since fewer vehicles contend for network resources. In this paper, we design, implement and analyze ACARS, a more robust algorithm with a significant increase in throughput performance and energy efficiency in the midst of high vehicle mobility.