13 results for Standard information

in Aston University Research Archive


Relevance: 40.00%

Abstract:

The IRDS standard is an international standard produced by the International Organisation for Standardisation (ISO). This work examines the process for producing standards both in formal standards organisations, such as the ISO, and in more informal bodies, such as the Object Management Group (OMG). Previous models and classifications of standards are reviewed and then combined to produce a new classification, within which the IRDS standard is placed as a reference anticipatory standard. Anticipatory standards are standards developed ahead of the technology in an attempt to guide the market. The diffusion of the IRDS is traced over a period of eleven years. The economic conditions which affect the diffusion of standards are examined, particularly those which prevail in compatibility markets such as the IT and ICT markets. The consequences of introducing gateway or converter devices into a market where a standard has not yet been established are also examined. The IRDS standard did not have an installed base, and this hindered its diffusion. The thesis concludes that the IRDS standard was overtaken by new developments such as object-oriented technologies and middleware, partly because of the slow, consensus-based development process of traditional standards organisations and partly because of the missing installed base. In addition, the rise and proliferation of middleware products meant that exchange mechanisms, rather than repository solutions, became dominant. The research method used in this work is a longitudinal study of the development and diffusion of the ISO/IEC IRDS standard. The research is regarded as a single case study and follows the interpretative epistemological point of view.

Relevance: 30.00%

Abstract:

We analyse the dynamics of a number of second-order on-line learning algorithms for training multi-layer neural networks, using the methods of statistical mechanics. We first consider on-line Newton's method, which is known to provide optimal asymptotic performance. We determine the asymptotic generalization error decay for a soft committee machine, which is shown to compare favourably with the result for standard gradient descent. Matrix momentum provides a practical approximation to this method by allowing an efficient inversion of the Hessian. We consider an idealized matrix momentum algorithm which requires access to the Hessian, and find close correspondence with the dynamics of on-line Newton's method. In practice, the Hessian will not be known on-line, and we therefore consider matrix momentum using a single-example approximation to the Hessian. In this case good asymptotic performance may still be achieved, but the algorithm is now sensitive to parameter choice because of noise in the Hessian estimate. On-line Newton's method is not appropriate during the transient learning phase, since a suboptimal unstable fixed point of the gradient descent dynamics becomes stable for this algorithm. A principled alternative is to use Amari's natural gradient learning algorithm, and we show how this method provides a significant reduction in learning time when compared to gradient descent, while retaining the asymptotic performance of on-line Newton's method.
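As a rough point of reference, the update rules compared above can be sketched as follows. This is a minimal illustration with invented step sizes; in particular, the matrix momentum form shown is an assumed simplification, not necessarily the authors' exact parameterisation.

```python
import numpy as np

def gradient_step(w, g, eta):
    # standard on-line gradient descent
    return w - eta * g

def newton_step(w, g, H, eta):
    # on-line Newton's method: precondition the gradient with the
    # inverse Hessian, giving optimal asymptotic performance
    return w - eta * np.linalg.solve(H, g)

def natural_gradient_step(w, g, F, eta):
    # Amari's natural gradient: precondition with the inverse Fisher
    # information matrix F instead of the Hessian
    return w - eta * np.linalg.solve(F, g)

def matrix_momentum_step(w, w_prev, g, H, eta, dt):
    # assumed matrix momentum form: a momentum term with matrix
    # coefficient (I - dt*H) approximates the Newton direction
    # without explicitly inverting H
    return w - eta * g + (np.eye(w.size) - dt * H) @ (w - w_prev)

# toy usage on a quadratic loss 0.5 * w' H w with known Hessian
H = np.array([[3.0, 0.5], [0.5, 1.0]])
w = np.array([1.0, -2.0])
g = H @ w                                # gradient of the quadratic at w
print(newton_step(w, g, H, eta=1.0))     # lands on the minimum [0, 0]
```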

Relevance: 30.00%

Abstract:

This paper addresses the problem of novelty detection in the case where the observed data is a mixture of a known 'background' process contaminated with an unknown other process, which generates the outliers, or novel observations. The framework we describe here is quite general, employing univariate classification with incomplete information, based on knowledge of the probability density function (pdf) of the data generated by the 'background' process. The relative proportion of this 'background' component (the prior 'background' probability), and the pdfs and prior probabilities of all other components, are assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known 'background' distribution. The method exploits the Kolmogorov-Smirnov test to estimate the proportions, after which the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the 'joker' data set. We propose this method as a reliable means of novelty detection in the emergency situation, one which can also be used to identify outliers prior to the application of a more general automatic mapping algorithm. © Springer-Verlag 2007.
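The flavour of this pipeline can be sketched as follows. This is an illustration rather than the authors' algorithm: the contamination model N(4,1), the 95% KS band and the posterior threshold are all invented, and the proportion estimate is a simple feasibility search against the known background CDF.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 900),    # known 'background' N(0,1)
                    rng.normal(4, 1, 100)])   # unknown contamination

xs = np.sort(x)
n = xs.size
F_emp = np.arange(1, n + 1) / n               # empirical CDF at the data
F0 = stats.norm.cdf(xs)                       # background CDF at the data

# Largest background proportion p consistent with the data: p*F0 may not
# exceed F_emp (and symmetrically in the upper tail) by more than a
# KS-style 95% tolerance band.
tol = 1.36 / np.sqrt(n)
eps = 1e-12
p_hat = min(1.0,
            ((F_emp + tol) / np.maximum(F0, eps)).min(),
            ((1 - F_emp + tol) / np.maximum(1 - F0, eps)).min())
print(f"estimated background proportion: {p_hat:.2f}")

# Bayes-optimal separation under the (invented) outlier model N(4,1):
# flag points whose posterior probability of 'background' is below 0.5.
post_bg = (p_hat * stats.norm.pdf(x, 0, 1)
           / (p_hat * stats.norm.pdf(x, 0, 1)
              + (1 - p_hat) * stats.norm.pdf(x, 4, 1)))
print(f"flagged {(post_bg < 0.5).sum()} of {n} points as novel")
```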

Relevance: 30.00%

Abstract:

The role of information in high-technology markets is critical (Dutta, Narasimhan and Rajiv 1999; Farrell and Saloner 1986; Weiss and Heide 1993). In these markets, the volatility and volume of information present managers and researchers with the considerable challenge of monitoring such information and examining how potential customers may respond to it. This article examines the effects of the type and volume of information on the market share of different technological standards in the Local Area Networks (LAN) industry. We identify three different types of information: technology-related, availability-related and adoption-related. Our empirical application suggests that all three types of information have significant effects on the market share of a technological standard, but their direction and magnitude differ. More specifically, technology-related information is negatively related to market share, as it signals that the underlying technology is immature and still evolving. Both availability-related and adoption-related information have a positive effect on market share, with the effect of the former larger than that of the latter. We conclude that high-tech firms should emphasize the dissemination of information, especially availability-related information, as part of their promotional strategy for a new technology. Otherwise, they may risk missing an opportunity to achieve a higher share and establish their market presence.

Relevance: 30.00%

Abstract:

With the extensive use of pulse modulation methods in telecommunications, much work has been done in the search for a better utilisation of the transmission channel. The present research is an extension of these investigations. A new modulation method, 'Variable Time-Scale Information Processing' (VTSIP), is proposed. The basic principles of this system have been established, and the main advantages and disadvantages investigated. With the proposed system, comparison circuits detect the instants at which the input signal voltage crosses predetermined amplitude levels. The time intervals between these occurrences are measured digitally and the results are temporarily stored before being transmitted. After reception, an inverse process enables the original signal to be reconstituted. The advantage of this system is that the irregularities in the rate of information contained in the input signal are smoothed out before transmission, allowing the use of a smaller transmission bandwidth. A disadvantage of the system is the time delay necessarily introduced by the storage process. Another disadvantage is a type of distortion caused by the finite store capacity. A simulation of the system has been made using a standard speech signal, to make some assessment of this distortion. It is concluded that the new system should be an improvement on existing pulse transmission systems, allowing the use of a smaller transmission bandwidth but introducing a time delay.
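A toy illustration of the level-crossing principle described above, under simplifying assumptions (uniformly sampled input, a fixed set of amplitude levels, and no modelling of the store capacity or the transmission channel):

```python
import numpy as np

def encode(signal, levels):
    """Record a (level index, elapsed interval) pair at each level crossing."""
    events, last_t = [], 0
    for t in range(1, len(signal)):
        a, b = signal[t - 1], signal[t]
        for i, lv in enumerate(levels):
            if (a < lv <= b) or (b <= lv < a):   # signal crossed level lv
                events.append((i, t - last_t))   # interval measured digitally
                last_t = t
    return events

def decode(events, levels, length):
    """Reconstitute a staircase approximation from the stored intervals."""
    out, t = np.zeros(length), 0
    for i, dt in events:
        t += dt
        out[t:] = levels[i]
    return out

ts = np.linspace(0.0, 1.0, 500)
x = np.sin(2 * np.pi * 3 * ts)                  # stand-in for a speech signal
levels = np.linspace(-1.0, 1.0, 9)
recon = decode(encode(x, levels), levels, len(x))
print(np.max(np.abs(x - recon)))                # staircase quantisation error
```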

Relevance: 30.00%

Abstract:

In this paper we consider the optimisation of Shannon mutual information (MI) in the context of two model neural systems. The first is a stochastic pooling network (population) of McCulloch-Pitts (MP) type neurons (logical threshold units) subject to stochastic forcing; the second is (in a rate-coding paradigm) a population of neurons each displaying Poisson statistics (the so-called 'Poisson neuron'). The mutual information is optimised as a function of a parameter that characterises the 'noise level': in the MP array this parameter is the standard deviation of the noise, while in the population of Poisson neurons it is the window length used to determine the spike count. In both systems we find that the emergent neural architecture, and hence the code that maximises the MI, is strongly influenced by the noise level. Low noise levels lead to a heterogeneous distribution of neural parameters (diversity), whereas medium to high noise levels result in the clustering of neural parameters into distinct groups that can be interpreted as subpopulations. In both cases the number of subpopulations increases with a decrease in noise level. Our results suggest that subpopulations are a generic feature of an information-optimal neural population.
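The MP-array side of the study can be illustrated numerically. The sketch below is not the paper's analysis: it assumes a Gaussian input and Gaussian noise, 8 units, and two invented threshold configurations standing in for 'clustered' versus 'diverse' populations.

```python
import numpy as np
from scipy.stats import norm

def mutual_information(thetas, sigma, n_grid=201):
    """MI (bits) between X ~ N(0,1) and the pooled spike count of
    threshold units that fire when x + noise_i > theta_i, noise_i ~ N(0, sigma)."""
    xs = np.linspace(-4.0, 4.0, n_grid)
    px = norm.pdf(xs)
    px /= px.sum()                                   # discretised input prior
    p_fire = norm.cdf((xs[:, None] - np.asarray(thetas)[None, :]) / sigma)
    # distribution of the count k given x (Poisson-binomial, by convolution)
    K = len(thetas)
    pk_x = np.zeros((n_grid, K + 1))
    pk_x[:, 0] = 1.0
    for j in range(K):
        pj = p_fire[:, j:j + 1]
        nxt = pk_x * (1.0 - pj)                      # unit j silent
        nxt[:, 1:] += pk_x[:, :-1] * pj              # unit j fires
        pk_x = nxt
    pk = px @ pk_x                                   # marginal count distribution
    ratio = np.where(pk_x > 0, pk_x / pk, 1.0)
    return float(np.sum(px[:, None] * pk_x * np.log2(ratio)))

# compare identical thresholds ('clustered') with spread-out ones ('diverse')
for sigma in (0.1, 1.0):
    clustered = mutual_information(np.zeros(8), sigma)
    diverse = mutual_information(np.linspace(-1.5, 1.5, 8), sigma)
    print(f"sigma={sigma}: clustered={clustered:.2f} bits, diverse={diverse:.2f} bits")
```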

Relevance: 30.00%

Abstract:

In this thesis, I describe studies on the fabrication, spectral characteristics and applications of tilted fibre gratings (TFGs) with small, large and 45° tilted structures, and novel developments in the fabrication of fibre Bragg gratings (FBGs) and long period gratings (LPGs) in normal silica and mid-infrared (mid-IR) glass fibres using a near-IR femtosecond laser. One of the major contributions presented in this thesis is the systematic investigation of the structures, inscription methods and spectral, polarisation dependent loss (PDL) and thermal characteristics of TFGs with small (<45°), large (>45°) and 45° tilted structures. I have experimentally characterised TFGs, obtaining relationships between the radiation angle, the central wavelength of the radiation profile, the Bragg resonance and the tilt angle, which are consistent with theoretical simulation based on mode-coupling theory. Furthermore, thermal responses have been measured for these three types of TFGs, showing that the transmission spectra of large and 45° TFGs are insensitive to temperature change, unlike normal and small-angle tilted FBGs. Based on these distinctive optical properties, TFGs have been developed into interrogation systems and sensors, which form the other significant contributions of the work presented in this thesis. The 10°-TFG based 800nm WDM interrogation system can function not just as an in-fibre spectrum analyser but also possesses refractive index sensing capability. By utilising their unique polarisation properties, the 81°-TFG based sensors are capable of sensing transverse loading and twisting with sensitivities of 2.04pW/(kg/m) and 145.90pW/rad, respectively. The final but most important contribution of the research work presented in this thesis is the development of novel grating inscription techniques using a near-IR femtosecond laser. A number of LPGs and FBGs were successfully fabricated in normal silica and mid-IR glass fibres using point-by-point and phase-mask techniques. LPGs and 1st and 2nd order FBGs fabricated in these mid-IR glass fibres show resonances covering the wavelength range from 1200 to 1700nm with strengths of up to 13dB. In addition, the thermal and strain sensitivities of these gratings have been systematically investigated. All the results from this initial but systematic work will provide useful characterisation information for future fibre grating based devices and applications in the mid-IR range.
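For context, the phase-matching relation commonly used in mode-coupling treatments of tilted gratings (a textbook relation, not a result derived in this thesis) links the Bragg resonance to the tilt angle:

```latex
% lambda_B: Bragg wavelength, n_eff: effective core-mode index,
% Lambda_g: grating period normal to the grating planes, theta: tilt angle
\lambda_B = \frac{2\, n_{\mathrm{eff}}\, \Lambda_g}{\cos\theta}
```

So, for a fixed grating period, larger tilt angles push the resonance to longer wavelengths, in line with the tilt-angle relationships characterised experimentally in the thesis.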

Relevance: 30.00%

Abstract:

Nowadays, road safety and traffic congestion are major concerns worldwide, which is why research on vehicular communication is vital. This paper analyses the impact of context information on existing popular rate adaptation algorithms. Our simulations were done in MATLAB, based on the IEEE 802.11p wireless standard, for both static and mobile cases. In static scenarios vehicles behave much as in an office network: they do not move and have no defined positions. In the mobile case, vehicles move with uniformly selected speeds and randomised positions. Network performance is analysed using context information. Our results show that when context information is used in the mobile case, system performance can be improved for all three rate adaptation algorithms. This can be explained by the fact that, with range checking, when many vehicles are out of communication range, fewer vehicles contend for network resources, thereby increasing network performance. © 2013 IEEE.
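A minimal sketch of the kind of range check described above (the class and parameter names are invented; the paper's MATLAB simulation details are not given in the abstract):

```python
import math

class Vehicle:
    def __init__(self, x, y, speed):
        self.x, self.y, self.speed = x, y, speed

def contenders(me, others, comm_range=300.0):
    """Context-based range check: only vehicles within communication
    range are counted as contending for the same channel resources."""
    return [v for v in others
            if math.hypot(v.x - me.x, v.y - me.y) <= comm_range]

# A rate adaptation loop would then size its contention estimate from
# len(contenders(me, others)) rather than from the whole vehicle set,
# which is what reduces contention when many vehicles are out of range.
```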

Relevance: 30.00%

Abstract:

In this paper, we propose a text mining method called LRD (latent relation discovery), which extends the traditional vector space model of document representation in order to improve information retrieval (IR) on documents and document clustering. Our LRD method extracts terms and entities, such as person, organization, or project names, and discovers relationships between them by taking into account their co-occurrence in textual corpora. Given a target entity, LRD discovers other entities closely related to the target effectively and efficiently. With respect to such relatedness, a measure of relation strength between entities is defined. LRD uses relation strength to enhance the vector space model, and uses the enhanced vector space model for query-based IR on documents and for clustering documents in order to discover complex relationships among terms and entities. Our experiments on a standard dataset for query-based IR show that our LRD method performed significantly better than the traditional vector space model and five other standard statistical methods for vector expansion.
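As a loose illustration of co-occurrence-based relation strength (the abstract does not give LRD's exact weighting, so a simple pointwise-mutual-information style score over toy documents is used here):

```python
import math
from collections import Counter
from itertools import combinations

# toy corpus: the set of entities extracted from each document
docs = [["alice", "acme", "project-x"],
        ["alice", "project-x"],
        ["bob", "acme"]]

count = Counter(t for d in docs for t in set(d))
pair = Counter(p for d in docs for p in combinations(sorted(set(d)), 2))
n_docs = len(docs)

def relation_strength(a, b):
    """PMI-style score: positive when a and b co-occur more often
    than expected under independence, 0 when they never co-occur."""
    a, b = sorted((a, b))
    p_ab = pair[(a, b)] / n_docs
    if p_ab == 0:
        return 0.0
    return math.log(p_ab / ((count[a] / n_docs) * (count[b] / n_docs)))

print(relation_strength("alice", "project-x"))   # strongly related
print(relation_strength("bob", "project-x"))     # unrelated -> 0.0
```

Scores of this kind can then be used to expand a document or query vector with the entities most strongly related to the ones it already contains.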

Relevance: 30.00%

Abstract:

Although the importance of dataset fitness-for-use evaluation and intercomparison is widely recognised within the GIS community, no practical tools have yet been developed to support such interrogation. GeoViQua aims to develop a GEO label which will visually summarise and allow interrogation of key informational aspects of geospatial datasets upon which users rely when selecting datasets for use. The proposed GEO label will be integrated in the Global Earth Observation System of Systems (GEOSS) and will be used as a value and trust indicator for datasets accessible through the GEO Portal. As envisioned, the GEO label will act as a decision support mechanism for dataset selection and thereby hopefully improve user recognition of the quality of datasets. To date we have conducted 3 user studies to (1) identify the informational aspects of geospatial datasets upon which users rely when assessing dataset quality and trustworthiness, (2) elicit initial user views on a GEO label and its potential role and (3) evaluate prototype label visualisations. Our first study revealed that, when evaluating the quality of data, users consider 8 facets: dataset producer information; producer comments on dataset quality; dataset compliance with international standards; community advice; dataset ratings; links to dataset citations; expert value judgements; and quantitative quality information. Our second study confirmed the relevance of these facets in terms of the community-perceived function that a GEO label should fulfil: users and producers of geospatial data supported the concept of a GEO label that provides a drill-down interrogation facility covering all 8 informational aspects. Consequently, we developed three prototype label visualisations and evaluated their comparative effectiveness and user preference via a third user study to arrive at a final graphical GEO label representation. When integrated in the GEOSS, an individual GEO label will be provided for each dataset in the GEOSS clearinghouse (or other data portals and clearinghouses) based on its available quality information. Producer and feedback metadata documents are being used to dynamically assess information availability and generate the GEO labels. The producer metadata document can either be a standard ISO compliant metadata record supplied with the dataset, or an extended version of a GeoViQua-derived metadata record, and is used to assess the availability of a producer profile, producer comments, compliance with standards, citations and quantitative quality information. GeoViQua is also currently developing a feedback server to collect and encode (as metadata records) user and producer feedback on datasets; these metadata records will be used to assess the availability of user comments, ratings, expert reviews and user-supplied citations for a dataset. The GEO label will provide drill-down functionality which will allow a user to navigate to a GEO label page offering detailed quality information for its associated dataset. At this stage, we are developing the GEO label service that will be used to provide GEO labels on demand based on supplied metadata records. In this presentation, we will provide a comprehensive overview of the GEO label development process, with specific emphasis on the GEO label implementation and integration into the GEOSS.
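A hypothetical sketch of the availability assessment step (the field names are invented for illustration; the actual GeoViQua producer and feedback records are ISO-based metadata documents):

```python
# The 8 informational facets identified in the first user study
FACETS = ["producer_profile", "producer_comments", "standards_compliance",
          "community_advice", "ratings", "citations",
          "expert_review", "quantitative_quality"]

def assess_label(producer_meta: dict, feedback_meta: dict) -> dict:
    """Mark each facet as available or not, given both metadata sources."""
    merged = {**producer_meta, **feedback_meta}
    return {facet: bool(merged.get(facet)) for facet in FACETS}

label = assess_label(
    {"producer_profile": "Aston University", "standards_compliance": "ISO 19115"},
    {"ratings": [4, 5], "community_advice": None},
)
print(label)   # drives which segments of the drill-down label are active
```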

Relevance: 30.00%

Abstract:

The UK government aims to achieve an 80% CO2 emission reduction by 2050, which requires collective effort across all UK industry sectors. In particular, the housing sector has a large potential to contribute because it alone accounts for 27% of total UK CO2 emissions, and 87% of the housing responsible for that 27% will still be standing in 2050. It is therefore essential to improve the energy efficiency of existing housing stock built to low energy efficiency standards. To this end, a whole house needs to be refurbished in a sustainable way by considering the lifetime financial and environmental impacts of the refurbished house. However, the current refurbishment process struggles to generate a financially and environmentally affordable refurbishment solution, owing to the highly fragmented nature of refurbishment practice and a lack of knowledge and skills about whole-house refurbishment in the construction industry. To generate an affordable refurbishment solution, diverse information regarding the costs and environmental impacts of refurbishment measures and materials should be collected and integrated in the right sequence throughout the refurbishment project life cycle among key project stakeholders. Consequently, researchers are increasingly studying ways of utilizing Building Information Modelling (BIM) to tackle these problems, because BIM can support construction professionals in managing construction projects collaboratively by integrating diverse information, and in determining the best refurbishment solution among various alternatives by calculating the life cycle costs and lifetime CO2 performance of each solution. Despite this capability, the BIM adoption rate in the housing sector is low, at 25%, and the use of BIM for housing refurbishment projects has rarely been studied. Therefore, this research aims to develop a BIM framework to formulate a financially and environmentally affordable whole-house refurbishment solution based simultaneously on the Life Cycle Costing (LCC) and Life Cycle Assessment (LCA) methods. To achieve this aim, a BIM feasibility study was first conducted as a pilot to examine whether BIM is suitable for housing refurbishment, and a BIM framework was then developed based on grounded theory, as there was no precedent research. The framework was examined through a hypothetical case study, using BIM input data collected from a questionnaire survey on homeowners' preferences for housing refurbishment. Finally, the BIM framework was validated by academics and professionals, who were provided with the framework and a refurbishment solution formulated through it on the basis of the LCC and LCA studies. As a result, BIM was identified as suitable for housing refurbishment as a management tool, and the development of a BIM framework was found to be timely. The BIM framework, comprising seven project stages, was developed to formulate an affordable refurbishment solution. Through the case study, the Building Regulations standard was identified as the most affordable energy efficiency standard, rendering the best LCC and LCA results when applied to a whole-house refurbishment solution.
In addition, the Fabric Energy Efficiency Standard (FEES) is recommended when customers are willing to adopt a higher energy standard, and a maximum of 60% of CO2 emissions can be reduced through whole-house fabric refurbishment to the FEES. Furthermore, limitations and challenges to fully utilizing the BIM framework for housing refurbishment were revealed, such as a lack of BIM objects with proper cost and environmental information, limited interoperability between different BIM software packages, and limited LCC and LCA datasets in BIM systems. Finally, the BIM framework was validated as suitable for housing refurbishment projects, and reviewers commented that the framework could be more practical if a specific BIM library for housing refurbishment, with proper LCC and LCA datasets, were developed. This research is expected to provide a systematic way of formulating a refurbishment solution using BIM, and to become a basis for further research on BIM in the housing sector that resolves the current limitations and challenges. Future research should enhance the BIM framework by developing a more detailed process map and developing BIM objects with proper LCC and LCA information.
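As a minimal illustration of the LCC side of such a comparison (all figures and the discount rate below are invented for illustration, not taken from the thesis):

```python
def life_cycle_cost(capital, annual_saving, years, rate=0.035):
    """Capital outlay minus the discounted stream of annual energy savings."""
    discounted = sum(annual_saving / (1 + rate) ** t
                     for t in range(1, years + 1))
    return capital - discounted

# compare two hypothetical wall-insulation options over 30 years
print(life_cycle_cost(capital=8000, annual_saving=450, years=30))
print(life_cycle_cost(capital=12000, annual_saving=600, years=30))
# the option with the lower (more negative) LCC is the more affordable;
# an LCA comparison would run the same loop over lifetime CO2 figures
```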

Relevance: 30.00%

Abstract:

Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique. Some were successful in detecting field defects comparable to those found by standard SAP visual field assessment, while others were less informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. The purpose of this study is to examine the benefit of adding the mfVEP hemifield sector analysis protocol to the standard HFA test where there is suspicious glaucomatous visual field loss. Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey visual field (HFA 24-2) tests, optical coherence tomography of the optic nerve head, and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol, the Hemifield Sector Analysis (HSA) protocol. The retinal nerve fibre layer (RNFL) thickness was recorded to identify subjects with suspicious RNFL loss. The HSA of the mfVEP results showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields differed significantly between the three groups (ANOVA, p<0.001, 95% CI). The superior-inferior difference was statistically significant in all 11/11 sectors in the glaucoma patient group (t-test, p<0.001), partially significant (5/11 sectors) in the glaucoma suspect group (t-test, p<0.01), and not significant for most sectors in the normal group (only 1/11 sectors significant). The sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively, and for glaucoma suspects 89% and 79%. The use of SAP and mfVEP results in subjects with suspicious glaucomatous visual field defects, identified by low RNFL thickness, is beneficial in confirming early visual field defects. The new HSA protocol used in mfVEP testing can detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol in addition to SAP analysis can provide information about focal visual field differences across the horizontal midline and confirm suspicious field defects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucomatous field loss. The HSA protocol can detect early field changes not detected by the standard HFA test.
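The sector-wise comparison can be sketched as follows with synthetic stand-in data (the real analysis uses measured mfVEP SNRs per hemifield sector; the numbers below are simulated):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_subjects, n_sectors = 36, 11
superior = rng.normal(6.0, 1.0, (n_subjects, n_sectors))         # SNRs
inferior = superior - rng.normal(1.5, 0.5, (n_subjects, n_sectors))  # simulated loss

# paired t-test between matching superior and inferior sectors
for sector in range(n_sectors):
    t, p = stats.ttest_rel(superior[:, sector], inferior[:, sector])
    print(f"sector {sector + 1:2d}: t = {t:5.2f}, p = {p:.4f}")
# sectors with p below the chosen threshold flag a hemifield asymmetry
```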

Relevance: 30.00%

Abstract:

BACKGROUND: Heavy menstrual bleeding (HMB) is a common problem, yet evidence to inform decisions about initial medical treatment is limited. OBJECTIVES: To assess the clinical effectiveness and cost-effectiveness of the levonorgestrel-releasing intrauterine system (LNG-IUS) (Mirena(®), Bayer) compared with usual medical treatment, with exploration of women's perspectives on treatment. DESIGN: A pragmatic, multicentre randomised trial with an economic evaluation and a longitudinal qualitative study. SETTING: Women who presented in primary care. PARTICIPANTS: A total of 571 women with HMB. A purposeful sample of 27 women who were randomised or ineligible owing to treatment preference participated in semistructured face-to-face interviews around 2 and 12 months after commencing treatment. INTERVENTIONS: LNG-IUS or usual medical treatment (tranexamic acid, mefenamic acid, combined oestrogen-progestogen or progesterone alone). Women could subsequently swap or cease their allocated treatment. OUTCOME MEASURES: The primary outcome was the patient-reported score on the Menorrhagia Multi-Attribute Scale (MMAS) assessed over a 2-year period and then again at 5 years. Secondary outcomes included general quality of life (QoL), sexual activity, surgical intervention and safety. Data were analysed using iterative constant comparison. A state transition model-based cost-utility analysis was undertaken alongside the randomised trial. Quality-adjusted life-years (QALYs) were derived from the European Quality of Life-5 Dimensions (EQ-5D) and the Short Form questionnaire-6 Dimensions (SF-6D). The intention-to-treat analyses were reported as cost per QALY gained. Uncertainty was explored by conducting both deterministic and probabilistic sensitivity analyses. RESULTS: The MMAS total scores improved significantly in both groups at all time points, but were significantly greater for the LNG-IUS than for usual treatment [mean difference over 2 years was 13.4 points, 95% confidence interval (CI) 9.9 to 16.9 points; p < 0.001]. However, this difference between groups was reduced and no longer significant by 5 years (mean difference in scores 3.9 points, 95% CI -0.6 to 8.3 points; p = 0.09). By 5 years, only 47% of women had an LNG-IUS in place and 15% were still taking usual medical treatment. Five-year surgery rates were low, at 20%, and were similar, irrespective of initial treatments. There were no significant differences in serious adverse events between groups. Using the EQ-5D, at 2 years, the relative cost-effectiveness of the LNG-IUS compared with usual medical treatment was £1600 per QALY, which by 5 years was reduced to £114 per QALY. Using the SF-6D, usual medical treatment dominates the LNG-IUS. The qualitative findings show that women's experiences and expectations of medical treatments for HMB vary considerably and change over time. Women had high expectations of a prompt effect from medical treatments. CONCLUSIONS: The LNG-IUS, compared with usual medical therapies, resulted in greater improvement over 2 years in women's assessments of the effect of HMB on their daily routine, including work, social and family life, and psychological and physical well-being. At 5 years, the differences were no longer significant. A similar low proportion of women required surgical intervention in both groups. The LNG-IUS is cost-effective in both the short and medium term, using the method generally recommended by the National Institute for Health and Care Excellence.
Using the alternative measures to value QoL will have a considerable impact on cost-effectiveness decisions. It will be important to explore the clinical and health-care trajectories of the ECLIPSE (clinical effectiveness and cost-effectiveness of levonorgestrel-releasing intrauterine system in primary care against standard treatment for menorrhagia) trial participants to 10 years, by which time half of the cohort will have reached menopause. TRIAL REGISTRATION: Current Controlled Trials ISRCTN86566246. FUNDING: This project was funded by the NIHR Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 19, No. 88. See the NIHR Journals Library website for further project information.