24 results for metadata schemes

in Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

This thesis presents a theoretical investigation of applications of the Raman effect in optical fibre communication, as well as the design and optimisation of various Raman-based devices and transmission schemes. The techniques used are mainly based on numerical modelling. The results presented in this thesis are divided into three main parts. First, novel designs of Raman fibre lasers (RFLs) based on Phosphosilicate core fibre are analysed and optimised for efficiency by using a discrete power balance model. The designs include a two-stage RFL based on Phosphosilicate core fibre for telecommunication applications, a composite RFL for the 1.6 μm spectral window, and a multiple-output-wavelength RFL intended as a compact pump source for flat-gain Raman amplifiers. The use of Phosphosilicate core fibre is shown to effectively reduce design complexity and hence leads to better efficiency, stability and potentially lower cost. Second, a generalised Raman amplified gain model approach based on power balance analysis and direct numerical simulation is developed. The approach can be used to effectively simulate optical transmission systems with distributed Raman amplification. Last, the potential employment of a hybrid amplification scheme, a combination of a distributed Raman amplifier and an Erbium-doped amplifier, is investigated using the generalised Raman amplified gain model. The analysis focuses on the use of the scheme to upgrade a standard fibre network to a 40 Gb/s system.
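The power balance approach mentioned above can be illustrated with a minimal sketch: the steady-state pump and signal equations for a forward-pumped distributed Raman amplifier, integrated with simple Euler steps. All fibre parameters below (gain efficiency, losses, wavelengths) are illustrative assumptions, not values from the thesis.

```python
import math

def raman_power_balance(Pp0, Ps0, L_km, dz=0.01,
                        Cr=0.45,        # Raman gain efficiency, 1/(W km) -- assumed
                        alpha_p=0.0576, # pump loss ~1450 nm, 1/km (0.25 dB/km)
                        alpha_s=0.0461, # signal loss ~1550 nm, 1/km (0.20 dB/km)
                        lam_ratio=1550/1450):  # photon-energy ratio lambda_s/lambda_p
    """Forward-pumped Raman amplifier: Euler integration of the
    steady-state power balance equations along the fibre (powers in W)."""
    Pp, Ps = Pp0, Ps0
    z = 0.0
    while z < L_km:
        dPp = (-alpha_p * Pp - lam_ratio * Cr * Pp * Ps) * dz  # pump loss + depletion
        dPs = (-alpha_s * Ps + Cr * Pp * Ps) * dz              # signal loss + Raman gain
        Pp += dPp
        Ps += dPs
        z += dz
    return Pp, Ps

# On-off gain: signal out with the pump on vs. pump off
_, Ps_on = raman_power_balance(0.5, 1e-4, 20.0)
_, Ps_off = raman_power_balance(0.0, 1e-4, 20.0)
gain_db = 10 * math.log10(Ps_on / Ps_off)
```

With these assumed numbers the on-off gain comes out around the familiar C_r·P_pump·L_eff estimate; a full model of the kind described would add backward pumping, multiple wavelengths and noise terms.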

Relevance:

20.00%

Publisher:

Abstract:

When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either unavailable or impossible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system - both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty.
The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interacting between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision making process.
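As an illustration of the kind of elicitation these tools support (a sketch, not the actual SHELF or UncertWeb implementation), the snippet below turns an expert's elicited quantiles into a piecewise-linear inverse CDF and forms a group consensus by quantile averaging; the quantile values are hypothetical.

```python
import bisect

def make_inverse_cdf(quantiles):
    """Piecewise-linear inverse CDF from elicited (probability, value)
    pairs, e.g. [(0.05, 10), (0.5, 20), (0.95, 40)]."""
    probs = [p for p, _ in quantiles]
    vals = [v for _, v in quantiles]
    def inv(u):
        if u <= probs[0]:
            return vals[0]
        if u >= probs[-1]:
            return vals[-1]
        i = bisect.bisect_left(probs, u)
        p0, p1, v0, v1 = probs[i - 1], probs[i], vals[i - 1], vals[i]
        # linear interpolation between the bracketing elicited quantiles
        return v0 + (v1 - v0) * (u - p0) / (p1 - p0)
    return inv

def pool_experts(expert_quantiles, u):
    """One simple consensus rule: average the experts' quantile functions."""
    invs = [make_inverse_cdf(q) for q in expert_quantiles]
    return sum(inv(u) for inv in invs) / len(invs)

expert_a = [(0.05, 10.0), (0.5, 20.0), (0.95, 40.0)]
expert_b = [(0.05, 15.0), (0.5, 30.0), (0.95, 50.0)]
consensus_median = pool_experts([expert_a, expert_b], 0.5)
```

In practice SHELF-style elicitation fits a parametric distribution to the quantiles and the consensus is reached through facilitated discussion rather than mechanical averaging; this sketch only shows the data flow.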

Relevance:

20.00%

Publisher:

Abstract:

Recent surveys reveal that many university students in the U.K. are not satisfied with the timeliness and usefulness of the feedback given by their tutors. Ensuring timeliness in marking can result in a reduction in the quality of feedback. Though suitable use of Information and Communication Technology should alleviate this problem, existing Virtual Learning Environments are inadequate to support detailed marking scheme creation and they provide little support for giving detailed feedback. This paper describes a unique new web-based tool called e-CAF for facilitating coursework assessment and feedback management directed by marking schemes. Using e-CAF, tutors can create or reuse detailed marking schemes efficiently without sacrificing the accuracy or thoroughness in marking. The flexibility in marking scheme design also makes it possible for tutors to modify a marking scheme during the marking process without having to reassess the students’ submissions. The resulting marking process will become more transparent to students.
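A detailed marking scheme of the kind e-CAF is described as managing might be modelled as a tree of criteria with marks and feedback attached; the structure and field names below are hypothetical, not taken from e-CAF itself.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """One node of a hierarchical marking scheme: either a leaf with
    marks and feedback, or a grouping node whose totals are derived."""
    name: str
    max_marks: float = 0.0
    awarded: float = 0.0
    feedback: str = ""
    children: list = field(default_factory=list)

    def total_max(self):
        return sum(c.total_max() for c in self.children) if self.children else self.max_marks

    def total_awarded(self):
        return sum(c.total_awarded() for c in self.children) if self.children else self.awarded

scheme = Criterion("Coursework", children=[
    Criterion("Design", 40, awarded=32, feedback="Clear architecture."),
    Criterion("Implementation", children=[
        Criterion("Correctness", 30, awarded=25),
        Criterion("Code quality", 30, awarded=20, feedback="Add tests."),
    ]),
])
```

Because totals are computed from the leaves, a tutor can restructure or reweight the scheme and every submission's mark is re-derived automatically, which is one way the "modify without reassessing" property described above could be achieved.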

Relevance:

20.00%

Publisher:

Abstract:

An analytical first order calculation of the impact of Gaussian white noise on a novel single Mach-Zehnder Interferometer demodulation scheme for DQPSK reveals a constant Q factor ratio to the conventional scheme.
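For context, the Q factor referred to is conventionally computed from the mark/space statistics under a Gaussian noise assumption; a minimal sketch of the standard definitions (not the paper's first-order calculation):

```python
import math

def q_factor(mu1, mu0, sigma1, sigma0):
    """Q factor from mark/space means and noise standard deviations
    (Gaussian white noise assumption)."""
    return (mu1 - mu0) / (sigma1 + sigma0)

def ber_from_q(q):
    """Bit-error rate under the Gaussian approximation: BER = 0.5 erfc(Q/sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))
```

A constant Q-factor ratio between two demodulation schemes, as the abstract reports, therefore translates directly into a fixed relative noise margin at any target BER.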

Relevance:

20.00%

Publisher:

Abstract:

The main aim of this thesis is to evaluate the economic and socio-economic viability of energy crops as raw material for bioenergy schemes at the local level. The case examined is Greece, a southern Mediterranean country. Based on the current state, on foreseen trends and on the information presented in the literature review (conducted at the beginning of the study), the main goal was defined as follows: 'To examine the evidence supporting a strong role for dedicated energy crops in local bioenergy developments in Greece, a sector that is forecast to be increasingly important in the short to medium term.' Two perennial energy crops, cardoon (Cynara cardunculus L.) and giant reed (Arundo donax L.), were evaluated. The thesis analysed their possible introduction into the agricultural system of Rhodope, northern Greece, as an alternative land use, through comparative financial appraisal with the main conventional crops. Based on the output of this comparative analysis, the breakeven point for the two selected energy crops was defined, along with a sensitivity analysis of the risk of a potential implementation. The author then performed an economic and socio-economic evaluation of a district heating system fuelled with energy crops in the selected region. Finally, the author, acknowledging that bioenergy deployment should be studied in the context of innovation, proceeded to examine the different perceptions of the key groups involved: farmers and potential end users. Results indicated that biomass exploitation for energy purposes is more likely to be accepted when it is seen clearly as one strand in a national energy, environmental and agricultural policy which embraces several sources of renewable energy, and which also encourages energy efficiency and conservation.
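The breakeven analysis described can be sketched as a simple gross-margin comparison: the energy crop breaks even at the farm-gate price that matches the margin of the conventional crop it would displace. The crop figures below are purely illustrative, not results from the thesis.

```python
def gross_margin(yield_t_ha, price_per_t, variable_costs_ha, subsidy_ha=0.0):
    """Gross margin per hectare: revenue plus any subsidy minus variable costs."""
    return yield_t_ha * price_per_t + subsidy_ha - variable_costs_ha

def breakeven_price(yield_t_ha, variable_costs_ha, target_margin_ha, subsidy_ha=0.0):
    """Farm-gate price per tonne at which the energy crop matches the
    gross margin of the displaced conventional crop."""
    return (target_margin_ha + variable_costs_ha - subsidy_ha) / yield_t_ha

# Hypothetical figures: an energy crop yielding 15 t/ha with 450 EUR/ha
# variable costs, displacing a conventional crop earning 600 EUR/ha.
p_be = breakeven_price(15.0, 450.0, 600.0)
```

A sensitivity analysis of the kind mentioned would then re-run this calculation across ranges of yield, cost and subsidy assumptions.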

Relevance:

20.00%

Publisher:

Abstract:

The current rate of global biodiversity loss led many governments to sign the international agreement ‘Halting Biodiversity Loss by 2010 and beyond’ in 2001. The UK government was one of these and has a number of methods to tackle this, such as commissioning specific technical guidance and supporting the UK Biodiversity Action Plan (BAP) targets. However, by far the most effective influence the government has upon current biodiversity levels is through the town planning system. This is due to the control it has over all phases of a new development scheme’s lifecycle. There is a growing body of regulations, policies and legislation dealing with biodiversity protection and enhancement across the hierarchical spectrum: from the global and European level down to regional and local levels. With these drivers in place, coupled with the promotion of benefits and incentives, increasing biodiversity value ought to be an achievable goal on most, if not all, development sites. However, in the professional world this is not the case, due to a number of obstructions. Many of these tend to be ‘process’ barriers, which are particularly prevalent with ‘urban’ and ‘major’ development schemes, and this is where the focus of this research paper lies. The paper summarises and discusses the results of a questionnaire survey, completed by Local Government Ecologists in England, regarding obstacles to maximising biodiversity enhancements on major urban development schemes. The paper additionally refers to insights from previous action research, specialist interviews and case studies to reveal the key process obstacles. Solutions to these obstacles are then alluded to and recommendations are made within the discussion.

Relevance:

20.00%

Publisher:

Abstract:

The IEEE 802.16 standard specifies two contention-based bandwidth request schemes working with the OFDM physical layer specification in point-to-multipoint (PMP) architecture: the mandatory one used in region-full and the optional one used in region-focused. This letter presents a unified analytical model to study the bandwidth efficiency and channel access delay performance of the two schemes. The impacts of access parameters, available bandwidth and subchannelization are taken into account. The model is validated by simulations. The mandatory scheme is observed to perform closely to the optional one when subchannelization is active for both schemes.
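The flavour of such contention analysis can be conveyed with a toy slotted-contention model (far simpler than the letter's unified model): if n stations each pick one of m transmission opportunities uniformly at random, a tagged request succeeds when no other station picks the same opportunity, giving P = (1 - 1/m)^(n-1). The sketch checks this analytic value against a Monte Carlo simulation.

```python
import random

def success_prob_analytic(n, m):
    """P that a tagged station's request collides with no other,
    when each of n stations picks one of m opportunities uniformly."""
    return (1 - 1 / m) ** (n - 1)

def success_prob_sim(n, m, trials=20000, seed=1):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        slots = [rng.randrange(m) for _ in range(n)]
        if slots.count(slots[0]) == 1:  # tagged station (index 0) alone in its slot
            ok += 1
    return ok / trials
```

A real 802.16 model would add truncated binary exponential backoff, subchannelization and the region-full/region-focused opportunity structures on top of this basic collision picture.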

Relevance:

20.00%

Publisher:

Abstract:

This work numerically analyzes the performance of a 2R (reamplification and reshaping) regenerator based on a nonlinear optical loop mirror and a 3R (reamplification, reshaping, and retiming) regenerator using a nonlinearly enhanced amplitude modulator in 40-Gb/s standard single-mode fiber (SMF)-based optical networks with large amplifier spacing. The characteristics of one-step (600 km of SMF) and two-step regeneration are examined and the feasibility of wavelength-division multiplexing (WDM) operation is demonstrated.
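A minimal sketch of the 2R element: the textbook power transfer function of a nonlinear optical loop mirror (NOLM), which reflects low-power noise while transmitting high-power marks. The coupler ratio, nonlinearity and loop length below are illustrative assumptions, not the paper's design values.

```python
import math

def nolm_transmission(P_in, rho=0.45, gamma=2.0, L=1.0):
    """Output power of a NOLM with coupler splitting ratio rho,
    fibre nonlinearity gamma (1/(W km)) and loop length L (km).
    T = 1 - 2 rho (1 - rho) (1 + cos(dphi)), dphi = (1 - 2 rho) gamma L P_in."""
    dphi = (1 - 2 * rho) * gamma * L * P_in
    T = 1 - 2 * rho * (1 - rho) * (1 + math.cos(dphi))
    return P_in * T

P_peak = math.pi / ((1 - 2 * 0.45) * 2.0 * 1.0)  # input power giving a pi phase shift
T_low = nolm_transmission(0.01) / 0.01           # transmissivity for low-power noise
T_peak = nolm_transmission(P_peak) / P_peak      # transmissivity at the switching peak
```

The nonlinear step between T_low and T_peak is what reshapes the pulses; a 3R stage adds a retiming mechanism such as the synchronous amplitude modulator mentioned above.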

Relevance:

20.00%

Publisher:

Abstract:

Multicast is an efficient approach to saving network bandwidth for multimedia streaming services. To provide Quality of Service (QoS) for multimedia services while maintaining the bandwidth efficiency of multicast, admission control for multicast sessions is required. Probe-based multicast admission control (PBMAC) schemes are a class of scalable and simple admission control schemes for multicast. The probing scheme is the essence of PBMAC. In this paper, after a detailed survey of three existing probing schemes, we evaluate these schemes using simulation and analytical approaches in two respects: admission correctness and group scalability. Admission correctness of the schemes is compared by simulation investigation. Analytical models for group scalability are derived and validated by simulation results. The evaluation results illustrate the advantages and weaknesses of each scheme, which is helpful when choosing a suitable probing scheme for a network.
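The essence of probe-based admission can be sketched in a few lines (a toy model, not one of the three surveyed schemes): a receiver is admitted only if a probe stream sent along its branch path would experience acceptable loss on every link.

```python
def probe_admission(link_loads, link_caps, probe_rate, loss_threshold=0.01):
    """Admit a new multicast receiver iff the probe stream would see
    loss below the threshold on every link of its branch path.
    Loads, capacities and probe_rate share the same units (e.g. Mb/s)."""
    for load, cap in zip(link_loads, link_caps):
        offered = load + probe_rate
        loss = max(0.0, (offered - cap) / offered)  # crude fluid loss estimate
        if loss > loss_threshold:
            return False
    return True
```

Real PBMAC schemes differ in where probes are sent, who measures the loss and how the decision is fed back, which is exactly what the admission-correctness and scalability comparison above examines.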

Relevance:

20.00%

Publisher:

Abstract:

Because poor quality semantic metadata can destroy the effectiveness of semantic web technology by hampering applications from producing accurate results, it is important to have frameworks that support its evaluation. However, no such framework has been developed to date. In this context, we proposed i) an evaluation reference model, SemRef, which sketches some fundamental principles for evaluating semantic metadata, and ii) an evaluation framework, SemEval, which provides a set of instruments to support the detection of quality problems and the collection of quality metrics for these problems. A preliminary case study of SemEval shows encouraging results.

Relevance:

20.00%

Publisher:

Abstract:

Because metadata that underlies semantic web applications is gathered from distributed and heterogeneous data sources, it is important to ensure its quality (i.e., reduce duplicates, spelling errors, ambiguities). However, current infrastructures that acquire and integrate semantic data have only marginally addressed the issue of metadata quality. In this paper we present our metadata acquisition infrastructure, ASDI, which pays special attention to ensuring that high quality metadata is derived. Central to the architecture of ASDI is a verification engine that relies on several semantic web tools to check the quality of the derived data. We tested our prototype in the context of building a semantic web portal for our lab, KMi. An experimental evaluation comparing the automatically extracted data against manual annotations indicates that the verification engine enhances the quality of the extracted semantic metadata.
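One kind of check such a verification engine performs can be illustrated crudely (a hypothetical stand-in, not the ASDI implementation): flagging near-duplicate metadata labels by string similarity, which catches case variants and many spelling errors.

```python
import difflib

def near_duplicates(labels, threshold=0.85):
    """Return pairs of metadata labels whose similarity ratio meets the
    threshold -- candidates for merging during quality verification."""
    flagged = []
    for i in range(len(labels)):
        for j in range(i + 1, len(labels)):
            r = difflib.SequenceMatcher(None, labels[i].lower(),
                                        labels[j].lower()).ratio()
            if r >= threshold:
                flagged.append((labels[i], labels[j], round(r, 2)))
    return flagged

labels = ["Knowledge Media Institute", "Knowledge media institute", "Raman amplifier"]
flagged = near_duplicates(labels)
```

ASDI's engine, as described, goes well beyond string matching by consulting semantic web tools, but duplicate and variant detection of this flavour is a typical first pass.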

Relevance:

20.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance:

20.00%

Publisher:

Abstract:

Although the importance of dataset fitness-for-use evaluation and intercomparison is widely recognised within the GIS community, no practical tools have yet been developed to support such interrogation. GeoViQua aims to develop a GEO label which will visually summarise and allow interrogation of key informational aspects of geospatial datasets upon which users rely when selecting datasets for use. The proposed GEO label will be integrated in the Global Earth Observation System of Systems (GEOSS) and will be used as a value and trust indicator for datasets accessible through the GEO Portal. As envisioned, the GEO label will act as a decision support mechanism for dataset selection and thereby hopefully improve user recognition of the quality of datasets. To date we have conducted three user studies to (1) identify the informational aspects of geospatial datasets upon which users rely when assessing dataset quality and trustworthiness, (2) elicit initial user views on a GEO label and its potential role, and (3) evaluate prototype label visualisations. Our first study revealed that, when evaluating quality of data, users consider 8 facets: dataset producer information; producer comments on dataset quality; dataset compliance with international standards; community advice; dataset ratings; links to dataset citations; expert value judgements; and quantitative quality information. Our second study confirmed the relevance of these facets in terms of the community-perceived function that a GEO label should fulfil: users and producers of geospatial data supported the concept of a GEO label that provides a drill-down interrogation facility covering all 8 informational aspects. Consequently, we developed three prototype label visualisations and evaluated their comparative effectiveness and user preference via a third user study to arrive at a final graphical GEO label representation.
When integrated in the GEOSS, an individual GEO label will be provided for each dataset in the GEOSS clearinghouse (or other data portals and clearinghouses) based on its available quality information. Producer and feedback metadata documents are being used to dynamically assess information availability and generate the GEO labels. The producer metadata document can either be a standard ISO-compliant metadata record supplied with the dataset, or an extended version of a GeoViQua-derived metadata record, and is used to assess the availability of a producer profile, producer comments, compliance with standards, citations and quantitative quality information. GeoViQua is also currently developing a feedback server to collect and encode (as metadata records) user and producer feedback on datasets; these metadata records will be used to assess the availability of user comments, ratings, expert reviews and user-supplied citations for a dataset. The GEO label will provide drill-down functionality which will allow a user to navigate to a GEO label page offering detailed quality information for its associated dataset. At this stage, we are developing the GEO label service that will be used to provide GEO labels on demand based on supplied metadata records. In this presentation, we will provide a comprehensive overview of the GEO label development process, with specific emphasis on the GEO label implementation and integration into the GEOSS.
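The facet-availability assessment that drives label generation might be sketched as follows; the facet names and metadata keys here are hypothetical simplifications of the 8 informational aspects described in the abstract, not GeoViQua's actual encoding.

```python
FACETS = ("producer_info", "producer_comments", "standards_compliance",
          "community_advice", "ratings", "citations",
          "expert_judgements", "quantitative_quality")

def geo_label_availability(producer_md, feedback_md):
    """Map each informational facet to a boolean: is any information for
    it present in the producer or feedback metadata record?"""
    merged = {**producer_md, **feedback_md}
    return {f: bool(merged.get(f)) for f in FACETS}

label = geo_label_availability(
    {"producer_info": "Aston University", "standards_compliance": "ISO 19115"},
    {"ratings": [4, 5], "expert_judgements": []},
)
```

A label service of the kind described would render each facet as filled or greyed-out in the graphical label, with drill-down links to the underlying producer or feedback records.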