948 results for Data reliability


Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this research is to show that reliability analysis and its implementation will lead to improved whole-life performance of building systems, and hence to improved life cycle costs (LCC). Design/methodology/approach – This paper analyses reliability impacts on the whole life cycle of building systems, and reviews the up-to-date approaches adopted in UK construction, based on questionnaires designed to investigate the use of reliability within the industry. Findings – Approaches to reliability design and maintainability design are introduced at the operating-environment, system-structural and component levels, and a scheduled maintenance logic tree is modified based on the model developed by Pride. At different stages of the whole life cycle of building services systems, reliability-associated factors should be considered to ensure the system's whole-life performance. It is suggested that data analysis should be applied in reliability design, maintainability design and maintenance policy development. Originality/value – The paper presents important factors at different stages of the whole life cycle of these systems, together with reliability and maintainability design approaches that can be helpful for building services system designers. The survey from the questionnaires provides designers with an understanding of the key impacting factors.

Relevance:

30.00%

Publisher:

Abstract:

Palaeodata in synthesis form are needed as benchmarks for the Palaeoclimate Modelling Intercomparison Project (PMIP). Advances since the last synthesis of terrestrial palaeodata from the last glacial maximum (LGM) call for a new evaluation, especially of data from the tropics. Here pollen, plant-macrofossil, lake-level, noble gas (from groundwater) and δ18O (from speleothems) data are compiled for 18±2 ka (14C), 32 °N–33 °S. The reliability of the data was evaluated using explicit criteria, and some types of data were re-analysed using consistent methods in order to derive a set of mutually consistent palaeoclimate estimates of mean temperature of the coldest month (MTCO), mean annual temperature (MAT), plant-available moisture (PAM) and runoff (P-E). Cold-month temperature (MTCO) anomalies from plant data range from −1 to −2 K near sea level in Indonesia and the S Pacific, through −6 to −8 K at many high-elevation sites, to −8 to −15 K in S China and the SE USA. MAT anomalies from groundwater or speleothems seem more uniform (−4 to −6 K), but the data are as yet sparse; a clear divergence between MAT and cold-month estimates from the same region is seen only in the SE USA, where cold-air advection is expected to have enhanced cooling in winter. Regression of all cold-month anomalies against site elevation yielded an estimated average cooling of −2.5 to −3 K at modern sea level, increasing to ≈−6 K by 3000 m. However, Neotropical sites showed larger-than-average sea-level cooling (−5 to −6 K) and a non-significant elevation effect, whereas W and S Pacific sites showed much less sea-level cooling (−1 K) and a stronger elevation effect. These findings support the inference that tropical sea-surface temperatures (SSTs) were lower than the CLIMAP estimates, but they limit the plausible average tropical sea-surface cooling, and they support the existence of CLIMAP-like geographic patterns in SST anomalies. Trends of PAM and lake levels indicate wet LGM conditions in the W USA and at the highest elevations, with generally dry conditions elsewhere. These results suggest a colder-than-present ocean surface producing a weaker hydrological cycle, more arid continents, and arguably steeper-than-present terrestrial lapse rates. Such linkages are supported by recent observations on freezing-level height and tropical SSTs; moreover, simulations of "greenhouse" and LGM climates point to several possible feedback processes by which low-level temperature anomalies might be amplified aloft.
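
An illustrative sketch (all values invented, not from the compiled data set) of the elevation regression described above: fitting cold-month temperature anomalies against site elevation to estimate the average sea-level cooling and the anomaly near 3000 m.

```python
# Illustrative sketch, not the authors' analysis: regress cold-month temperature
# anomalies against site elevation; the intercept approximates sea-level cooling.
import numpy as np

elevation_m = np.array([50, 300, 1200, 2500, 3000])        # hypothetical site elevations
mtco_anomaly_K = np.array([-2.5, -3.0, -4.5, -5.5, -6.0])  # hypothetical anomalies

slope, intercept = np.polyfit(elevation_m, mtco_anomaly_K, 1)
print(f"Estimated sea-level cooling: {intercept:.1f} K")
print(f"Estimated anomaly at 3000 m: {intercept + slope * 3000:.1f} K")
```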

Relevance:

30.00%

Publisher:

Abstract:

Low-power medium access control (MAC) protocols used for communication by energy-constrained wireless embedded devices do not cope well with situations where transmission channels are highly erroneous. Existing MAC protocols discard corrupted messages, which leads to costly retransmissions. To improve transmission performance, it is possible to include an error correction scheme and transmit/receive diversity: redundant information can be added to transmitted packets in order to recover data from corrupted packets, and transmit/receive diversity via multiple antennas can be used to improve the error resiliency of transmissions. Both schemes may be used in conjunction to further improve performance. In this study, the authors show how an error correction scheme and transmit/receive diversity can be integrated into low-power MAC protocols. Furthermore, the authors investigate the achievable performance gains of both methods. This is important as both methods have associated costs (processing requirements; additional antennas and power), and for a given communication situation it must be decided which methods should be employed. The authors' results show that, in many practical situations, error control coding outperforms transmission diversity; however, if very high reliability is required, it is useful to employ both schemes together.
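
As an illustration of the general idea of adding redundancy to recover corrupted data (not the particular coding scheme evaluated by the authors), the sketch below uses a Hamming(7,4) code: three parity bits per four data bits allow a single flipped bit to be corrected at the receiver instead of forcing a retransmission.

```python
# Minimal Hamming(7,4) sketch: encode 4 data bits with 3 parity bits, then detect
# and correct a single-bit channel error from the syndrome.
def hamming74_encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]      # codeword positions 1..7

def hamming74_decode(c):
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]            # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]            # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]            # parity check over positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3                # 0 means no detected error
    if pos:
        c[pos - 1] ^= 1                       # correct the flipped bit
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
codeword[5] ^= 1                              # simulate a single-bit channel error
assert hamming74_decode(codeword) == [1, 0, 1, 1]
```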

Relevance:

30.00%

Publisher:

Abstract:

Environment monitoring applications using Wireless Sensor Networks (WSNs) have received a great deal of attention in recent years. In much of this research, tasks such as sensor data processing, decision making about environment states and events, and the sending of emergency messages are done by a remote server. A cross-layer protocol proposed for two different applications, in which the reliability of delivered data, delay and the lifetime of the network need to be considered, has been simulated, and the results are presented in this paper. A WSN designed for the proposed applications needs efficient MAC and routing protocols to provide a guarantee of the reliability of the data delivered from source nodes to the sink. A cross-layer design based on the one given in [1] has been extended and simulated for the proposed applications, with new features, such as route discovery algorithms, added. Simulation results show that the proposed cross-layer-based protocol can conserve energy for nodes and provide the required performance in terms of network lifetime, delay and reliability.

Relevance:

30.00%

Publisher:

Abstract:

Prior literature shows that the Felder and Silverman learning styles model (FSLSM) has been widely adopted to cater to the individual styles of learners, whether in traditional or Technology Enhanced Learning (TEL). In order to infer this model, the Index of Learning Styles (ILS) instrument was proposed. This research aims to analyse the soundness of this instrument in an Arabic sample. Data were integrated from different courses and years. A total of 259 engineering students participated voluntarily in the study. Reliability was analysed by applying internal-construct reliability, inter-scale correlation and total item correlation. Construct validity was also considered by running a factor analysis. The overall results indicated that the reliability and validity of the perception and input dimensions were moderately supported, whereas the processing and understanding dimensions showed low internal-construct consistency and their items loaded weakly on the associated constructs. Generally, the instrument needs further effort to improve its soundness. However, considering the consistency of the results produced for engineering students irrespective of cross-cultural differences, it can be adopted to diagnose learning styles.
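
A minimal sketch of the internal-construct reliability analysis mentioned above, using Cronbach's alpha; the item responses are invented and the numeric coding of ILS items is an assumption for illustration only.

```python
# Cronbach's alpha for one scale: alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items of one ILS dimension."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses of 6 students to 4 items of one dimension
responses = [[1, 0, 1, 1], [0, 0, 1, 0], [1, 1, 1, 1],
             [0, 1, 0, 0], [1, 1, 1, 0], [0, 0, 0, 1]]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```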

Relevance:

30.00%

Publisher:

Abstract:

An ability to quantify the reliability of probabilistic flood inundation predictions is a requirement not only for guiding model development but also for their successful application. Probabilistic flood inundation predictions are usually produced by choosing a method of weighting the model parameter space, but previous work suggests that this choice leads to clear differences in inundation probabilities. This study aims to address the evaluation of the reliability of these probabilistic predictions. However, the lack of an adequate number of observations of flood inundation for a catchment limits the application of conventional methods of evaluating predictive reliability. Consequently, attempts have been made to assess the reliability of probabilistic predictions using multiple observations from a single flood event. Here, a LISFLOOD-FP hydraulic model of an extreme (>1 in 1000 years) flood event in Cockermouth, UK, is constructed and calibrated using multiple performance measures from both peak flood wrack mark data and aerial photography captured post-peak. These measures are used in weighting the parameter space to produce multiple probabilistic predictions for the event. Two methods of assessing the reliability of these probabilistic predictions using limited observations are utilized: an existing method that assesses the binary pattern of flooding, and a method developed in this paper to assess predictions of water surface elevation. This study finds that the water surface elevation method has both better diagnostic and discriminatory ability, but this result is likely to be sensitive to the unknown uncertainties in the upstream boundary condition.
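
A minimal sketch (assumed, not the study's actual procedure) of the kind of parameter-space weighting described above: performance measures for an ensemble of hydraulic-model runs are turned into normalized likelihood weights, which then give a weighted probability of inundation for each map cell. All arrays are randomly generated stand-ins.

```python
# Turn ensemble performance scores into likelihood weights and per-cell
# inundation probabilities (all data here are hypothetical).
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_cells = 100, 500
scores = rng.uniform(0.0, 1.0, n_runs)              # hypothetical performance measures
flooded = rng.random((n_runs, n_cells)) < 0.3        # hypothetical binary flood maps

weights = np.clip(scores - 0.5, 0.0, None)           # reject runs below a threshold
weights /= weights.sum()                             # normalise to likelihood weights

p_inundation = weights @ flooded.astype(float)       # weighted probability per cell
print(p_inundation[:5])
```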

Relevance:

30.00%

Publisher:

Abstract:

Wireless Sensor Networks (WSNs) have been an exciting topic in recent years. The services offered by a WSN can be classified into three major categories: monitoring, alerting, and information on demand. WSNs have been used for a variety of applications related to the environment (agriculture, water and forest fire detection), the military, buildings, health (elderly people and home monitoring), disaster relief, and area or industrial monitoring. In most WSNs, tasks like processing the sensed data, making decisions and generating emergency messages are carried out by a remote server, hence the need for efficient means of transferring data across the network. Because of the range of applications and types of WSN, there is a need for different kinds of MAC and routing protocols in order to guarantee the delivery of data from the source nodes to the server (or sink). In order to minimize energy consumption and increase performance in areas such as reliability of data delivery, extensive research has been conducted and documented in the literature on designing energy-efficient protocols for each individual layer. The most common way to conserve energy in WSNs is to use the MAC layer to put the transceiver and the processor of the sensor node into a low-power sleep state when they are not being used; the energy wasted due to collisions, overhearing and idle listening is thereby reduced. As a result of this strategy for saving energy, the routing protocols need new solutions that take into account the sleep state of some nodes, and which also enable the lifetime of the entire network to be increased by distributing energy usage between nodes over time. This could mean that a combined MAC and routing protocol could significantly improve WSNs, because the interaction between the MAC and network layers lets nodes be active at the same time in order to deal with data transmission. In the research presented in this thesis, a cross-layer protocol based on MAC and routing protocols was designed in order to improve the capability of WSNs for a range of different applications. Simulation results, based on a range of realistic scenarios, show that these new protocols improve WSNs by reducing their energy consumption as well as enabling them to support mobile nodes, where necessary. A number of conference and journal papers have been published to disseminate these results for a range of applications.

Relevance:

30.00%

Publisher:

Abstract:

We perform a statistical study of the process of orbital determination of the HD82943 extrasolar planetary system, using the current observational data set of N = 165 radial velocity (RV) measurements. Our aim is to analyse the dispersion of possible orbital fits leading to residuals compatible with the best solution, and to discuss the sensitivity of the results with respect to both the data set and the error distribution around the best fit. Although some orbital parameters (e.g. semimajor axis) appear well constrained, we show that the best fits for the HD82943 system are not robust, and at present it is not possible to estimate reliable solutions for these bodies. Finally, we discuss the possibility of a third planet, with a mass of 0.35 M_Jup and an orbital period of 900 d. Stability analysis and simulations of planetary migration indicate that such a hypothetical three-planet system could be locked in a double 2/1 mean-motion resonance, similar to the so-called Laplace resonance of the three inner Galilean satellites of Jupiter.
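
As an illustration only (the study fits full Keplerian, multi-planet models to the 165 RV measurements), the sketch below fits a simple circular-orbit, single-planet RV model to invented data and reports the best-fit parameters and residual scatter, the raw ingredients for exploring the dispersion of acceptable fits.

```python
# Circular-orbit RV model fitted to hypothetical measurements with scipy.
import numpy as np
from scipy.optimize import curve_fit

def rv_model(t, K, P, phi, gamma):
    """RV curve: semi-amplitude K, period P, phase phi, systemic velocity gamma."""
    return K * np.sin(2 * np.pi * t / P + phi) + gamma

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 1200, 60))                       # hypothetical epochs (days)
v = rv_model(t, 46.0, 219.0, 0.3, 5.0) + rng.normal(0, 7.0, t.size)

popt, pcov = curve_fit(rv_model, t, v, p0=[40, 220, 0.0, 0.0])
residuals = v - rv_model(t, *popt)
print("best-fit parameters:", popt)
print("rms of residuals:", residuals.std())
```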

Relevance:

30.00%

Publisher:

Abstract:

The use of bivariate distributions plays a fundamental role in survival and reliability studies. In this paper, we consider a location-scale model for bivariate survival times based on the proposal of a copula to model the dependence of bivariate survival data. For the proposed model, we consider inferential procedures based on maximum likelihood. Gains in efficiency from bivariate models are also examined in the censored-data setting. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and compared to the performance of the bivariate regression model for matched paired survival data. Sensitivity analysis methods such as local and total influence are presented and derived under three perturbation schemes. The martingale marginal and deviance marginal residual measures are used to check the adequacy of the model. Furthermore, we propose a new measure which we call the modified deviance component residual. The methodology in the paper is illustrated on a lifetime data set for kidney patients.
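
A minimal sketch, assuming a Clayton copula with exponential margins (the paper's copula-based location-scale model is more general), of how dependent bivariate survival times, e.g. paired patient lifetimes, can be simulated via conditional inversion of the copula.

```python
# Simulate bivariate survival times whose dependence follows a Clayton copula.
import numpy as np

def clayton_bivariate_exponential(n, theta, rate1, rate2, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    # conditional inverse of the Clayton copula: V | U = u
    v = (u ** (-theta) * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
    t1 = -np.log(u) / rate1          # exponential margin via inverse survival function
    t2 = -np.log(v) / rate2
    return t1, t2

t1, t2 = clayton_bivariate_exponential(n=1000, theta=2.0, rate1=0.1, rate2=0.05)
print("sample correlation of the simulated pairs:", np.corrcoef(t1, t2)[0, 1])
```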

Relevance:

30.00%

Publisher:

Abstract:

The main idea of this research is to solve the problem of inventory management for the paper industry SPM PVT Limited. The aim of this research was to find a methodology by which the inventory of raw material could be kept at a minimum level by means of a buffer stock level. The main objective then lies in finding the minimum level of buffer stock according to the daily consumption of raw material, finding the Economic Order Quantity (EOQ) and reorder point, and determining how many orders will be placed in a year to control shortages of raw material. In this project, we discuss a continuous review model (deterministic EOQ model) that includes the probabilistic demand directly in the formulation. According to the formula, we determine the reorder point and the order-up-to level. The problem was tackled mathematically, and simulation modelling was used where a mathematically tractable solution was not possible. The simulation modelling was done with Awesim software for developing the simulation network. This simulation network has the ability to predict the buffer stock level based on variable consumption of raw material and lead time. The data for this simulation network were collected from industrial engineering personnel and departmental studies of the factory concerned. At the end, we find the optimum order quantity, reorder point and order days.
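
A worked sketch of the standard EOQ and reorder-point formulas referred to above; all numerical values are invented for illustration and are not taken from the SPM PVT data.

```python
# EOQ = sqrt(2 * D * S / H); reorder point = daily demand * lead time + safety stock.
import math

annual_demand = 12000              # units of raw material per year (hypothetical)
order_cost = 500.0                 # fixed cost per order (hypothetical)
holding_cost = 2.5                 # holding cost per unit per year (hypothetical)
daily_demand = annual_demand / 300 # assuming 300 working days per year
lead_time_days = 7
safety_stock = 150                 # buffer stock against demand/lead-time variability

eoq = math.sqrt(2 * annual_demand * order_cost / holding_cost)
reorder_point = daily_demand * lead_time_days + safety_stock
orders_per_year = annual_demand / eoq

print(f"EOQ = {eoq:.0f} units")
print(f"Reorder point = {reorder_point:.0f} units")
print(f"Orders per year = {orders_per_year:.1f}")
```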

Relevance:

30.00%

Publisher:

Abstract:

GPS technology has nowadays been embedded into portable, low-cost electronic devices to track the movements of mobile objects. This has greatly impacted the transportation field by creating a novel and rich source of traffic data on the road network. Although the promise offered by GPS devices to overcome problems like underreporting, respondent fatigue, inaccuracies and other human errors in data collection is significant, the technology is still relatively new and raises many issues for potential users. These issues tend to revolve around the following areas: reliability, data processing and the related applications. This thesis aims to study GPS tracking from the methodological, technical and practical aspects. It first evaluates the reliability of GPS-based traffic data using data from an experiment containing three different traffic modes (car, bike and bus) travelling along the road network. It then outlines the general procedure for processing GPS tracking data and discusses related issues uncovered by using real-world GPS tracking data from 316 cars. Thirdly, it investigates the influence of road network density on finding optimal locations for enhancing travel efficiency and decreasing travel cost. The results show that the geographical positioning is reliable. Velocity is slightly underestimated, whereas altitude measurements are unreliable. Post-processing techniques with auxiliary information are found to be necessary and important when resolving the inaccuracies of GPS data. The density of the road network influences the finding of optimal locations; the influence stabilizes at a certain level and does not deteriorate when the node density is higher.
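
An illustrative sketch (not the thesis code) of one basic GPS post-processing step: estimating speed between two consecutive fixes using the haversine distance. The coordinates and time gap are hypothetical.

```python
# Haversine great-circle distance between two GPS fixes, then a speed estimate.
import math

def haversine_m(lat1, lon1, lat2, lon2, radius_m=6371000.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_m * math.asin(math.sqrt(a))

# two hypothetical fixes recorded 10 s apart
d = haversine_m(59.8586, 17.6389, 59.8595, 17.6410)
print(f"distance = {d:.1f} m, speed = {d / 10:.1f} m/s")
```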

Relevance:

30.00%

Publisher:

Abstract:

The objective of this work is to present a preliminary investigation of the accuracy of the results of the transmitter geographic location system developed using the software of the Brazilian data collection network. A set of Doppler shift measurements from a single satellite pass, considering one Data Collection Platform (PCD) and a network of ground receiving stations, is referred to as a data reception network. Thus, the Brazilian data collection network, through the use of multiple receiving stations, will allow an increase in the amount of data collected, with a consequent improvement in the accuracy and reliability of the locations provided, and hence a larger number of valid and more accurate locations. The results and analyses were obtained under two conditions: in the first, a practical condition with real data and simulated ideal data was considered, in order to compare the results for the same satellite pass, transmitter and two known receiving stations; in the second, simulated ideal conditions were considered, based on measurements from a fixed transmitter, three receiving stations and two satellites. The results obtained using the data reception network were quite satisfactory. The study showed the importance of installing new ground receiving stations distributed throughout the national territory, in order to increase the number of measurements and, consequently, the number of valid and more accurate locations.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Internet data collection is becoming increasingly popular in all research fields dealing with human perceptions, behaviours and opinions. Advantages of internet data collection, when compared to the traditional paper-and-pencil format, include reduced costs, automatic database creation, the absence of researcher-related bias effects, availability and complete anonymity. However, the validity and reliability of internet-gathered data must be established, in comparison to the usual accepted paper-and-pencil formats, before an inferential analysis can be done. In this study, we compared questionnaire data gathered from the internet with that from the traditional paper-and-pencil format in a sample of college students. The questionnaires used were the Maslach Burnout Inventory - Student Survey (MBI-SS), the Oldenburg Burnout Inventory (OBI-SS) and the Copenhagen Burnout Inventory (CBI-SS). Data were gathered through a within-subject, randomized and counterbalanced crossover design, in both internet and paper-and-pencil formats. The results showed no effect of application order and good reliability for both formats. However, concordance between answers was generally higher in the paper-and-pencil format than on the internet. The factorial structure was invariant for the three burnout inventories. Data gathered in this study support the internet as a convenient, user-friendly, comfortable and secure data-gathering method which does not affect the accepted factorial structures existent in the paper format of the three burnout inventories used. (C) 2011 Elsevier Ltd. All rights reserved.
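
A minimal sketch (with invented scores) of the kind of concordance check between paper-and-pencil and internet administrations described above: correlating each respondent's total score under the two formats and looking at the mean difference between them.

```python
# Paired concordance between formats: Pearson correlation and mean difference.
import numpy as np

paper_scores = np.array([34, 28, 41, 22, 37, 30, 26, 39])   # hypothetical totals
online_scores = np.array([33, 29, 40, 24, 36, 31, 27, 38])  # hypothetical totals

r = np.corrcoef(paper_scores, online_scores)[0, 1]
mean_diff = (online_scores - paper_scores).mean()
print(f"Pearson r = {r:.2f}, mean difference (online - paper) = {mean_diff:.2f}")
```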