948 results for Data reliability


Relevance:

30.00%

Publisher:

Abstract:

The Pierre Auger Cosmic Ray Observatory North site employs a large array of surface detector stations (tanks) to detect the secondary particle showers generated by ultra-high-energy cosmic rays. Because ultra-high-energy cosmic rays are rare, tank communications must be highly reliable so that no valuable data are lost. The Auger North site employs a peer-to-peer paradigm, the Wireless Architecture for Hard Real-Time Embedded Networks (WAHREN), designed specifically for highly reliable message delivery over fixed networks under hard real-time deadlines. The WAHREN design includes two retransmission protocols, Micro- and Macro-retransmission. To understand fully how each retransmission protocol increases the reliability of communications, this analysis evaluated the system without either retransmission protocol (Case-0), with each protocol individually (Micro and Macro), and with both combined. This thesis used a multimodal modeling methodology to show that a performance and reliability analysis of WAHREN was possible, and it presents the results of that analysis. A multimodal approach was necessary because the underlying processes are driven by different mathematical models. The results of this analysis can be used as a framework for design decisions for the Auger North communication system.
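For intuition about why retransmission raises delivery reliability, the following is a minimal sketch under a deliberately simplified assumption of independent attempts with a fixed success probability; the values and the layering of macro over micro are hypothetical, not the WAHREN failure model itself.

```python
# Illustrative only: assumes independent transmission attempts with a
# fixed per-attempt success probability p. The actual WAHREN analysis
# uses a multimodal model, not this simple Bernoulli approximation.

def delivery_probability(p: float, retries: int) -> float:
    """Probability that at least one of (1 + retries) attempts succeeds."""
    return 1.0 - (1.0 - p) ** (1 + retries)

p = 0.95  # hypothetical single-attempt success rate
case_0 = delivery_probability(p, retries=0)    # no retransmission
micro = delivery_probability(p, retries=2)     # micro-retransmission only
# Macro on top of micro: treat each micro-protected delivery as one
# attempt, then allow one end-to-end (macro) retry.
combined = delivery_probability(micro, retries=1)
```

Even with these toy numbers, the ordering Case-0 < Micro < combined illustrates why the thesis evaluates the four configurations separately.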

Relevance:

30.00%

Publisher:

Abstract:

By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as "networks on wheels", can greatly enhance traffic safety, traffic efficiency and the driving experience for intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and uneven distribution of vehicular nodes, pose critical efficiency and reliability challenges for their implementation. This dissertation is motivated by the great application potential of VANETs in the design of efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this research aims to enhance traffic safety and traffic efficiency, and to develop novel commercial applications based on VANETs, along four lines: 1) accurate and efficient message aggregation to detect on-road safety-relevant events; 2) reliable data dissemination to notify remote vehicles; 3) efficient and reliable spatial data collection from vehicular sensors; and 4) novel applications that exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The relative-position-based message dissemination (RPB-MD) scheme is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone of relevance under varying traffic density. Because abundant vehicular sensor data are available in VANETs, the compressive-sampling-based data collection (CS-DC) scheme is proposed to efficiently collect spatially correlated data at large scale, especially in dense traffic. In addition, building on the solutions proposed for data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit their commercial potential, namely general-purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD) and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination (message aggregation, data dissemination and data collection), together with the development of these applications, this dissertation helps push VANETs further toward the stage of massive deployment.

Relevance:

30.00%

Publisher:

Abstract:

My study investigated internal consistency estimates from psychometric surveys as an operationalization of the state of measurement precision for constructs in industrial and organizational (I/O) psychology. Analyses covered samples from research articles published in the Journal of Applied Psychology between 1975 and 2010, at five-year intervals (K = 934 samples from 480 articles, yielding 1,427 coefficients). Articles and their samples were coded for test-taker characteristics (e.g., age, gender, and ethnicity), research settings (e.g., lab and field studies), and test characteristics (e.g., number of items and scale anchor points). A depository of reliability and inter-item correlation estimates was developed for I/O variables and construct groups. Personality measures had significantly lower inter-item correlations than other construct groups. Internal consistency estimates and reporting practices were also evaluated over time, demonstrating improvement both in measurement precision and in the reporting of missing data.
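The two quantities tabulated in such a depository, Cronbach's alpha and the mean inter-item correlation, can be sketched as follows; the data below are simulated for illustration, not from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items). Standard alpha formula:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def mean_inter_item_r(items: np.ndarray) -> float:
    """Mean of the off-diagonal item-item correlations."""
    r = np.corrcoef(items, rowvar=False)
    k = r.shape[0]
    return r[np.triu_indices(k, 1)].mean()

# Simulated survey: 4 items driven by one latent trait plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + rng.normal(scale=1.0, size=(200, 4))
alpha = cronbach_alpha(items)
mean_r = mean_inter_item_r(items)
```

With equal latent and noise variance the inter-item correlations hover near 0.5, which for 4 items implies alpha near 0.8 by the Spearman-Brown relation.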

Relevance:

30.00%

Publisher:

Abstract:

In recent years, 380 V DC and 48 V DC distribution systems have been extensively studied for modern data centers. The 380 V DC system is widely regarded as a very promising candidate because of its lower cable cost compared to the 48 V DC system. However, previous studies have not adequately addressed the low reliability of 380 V DC systems caused by the large number of series-connected batteries. In this thesis, a quantitative comparison of the two systems is presented in terms of efficiency, reliability and cost. A new multi-port DC UPS with both a high-voltage output and a low-voltage output is proposed. When utility AC is available, it delivers power to the load through its high-voltage output and charges the battery through its low-voltage output. When utility AC is off, it boosts the low battery voltage and delivers power to the load from the battery. The advantages of both systems are thus combined while their disadvantages are avoided. High efficiency is also achieved because only one converter operates in either situation. Details of the design and analysis of the new UPS are presented. For the main AC-DC stage of the new UPS, a novel bridgeless three-level single-stage AC-DC converter is proposed. It eliminates the auxiliary circuit for balancing the capacitor voltages and the two bridge rectifier diodes of the previous topology. Zero-voltage switching, high power factor, and low component stresses are achieved with this topology. Compared to previous topologies, the proposed converter has lower cost, higher reliability, and higher efficiency. The steady-state operation of the converter is analyzed and a decoupled model is proposed for it. For the battery-side converter of the new UPS, a ZVS bidirectional DC-DC converter based on self-sustained oscillation control is proposed. Frequency control ensures the ZVS operation of all four switches, and phase-shift control regulates the converter output power. A detailed analysis of the steady-state operation and design of the converter is presented. Theoretical, simulation, and experimental results verify the effectiveness of the proposed concepts.
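The reliability penalty of a long series battery string can be illustrated with a first-order series-reliability calculation; the per-battery reliability and string lengths below are hypothetical round numbers, not figures from the thesis.

```python
# Hypothetical illustration: a series string works only if every battery
# in it works, so string reliability is the product of cell reliabilities.
r_cell = 0.99  # assumed per-battery reliability over some time horizon

r_48v_string = r_cell ** 4    # e.g. four 12 V batteries in series
r_380v_string = r_cell ** 32  # e.g. thirty-two 12 V batteries in series
```

Even with highly reliable individual batteries, the long 380 V string is markedly less reliable than the short 48 V string, which is the issue the proposed multi-port UPS sidesteps by keeping the battery at low voltage.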

Relevance:

30.00%

Publisher:

Abstract:

Given the rapid changes that govern the Swedish financial sector, such as financial deregulation and technological innovation, it is important to examine how Swedish financial institutions have performed amid these changes. To accomplish this, the work investigates the following research question: what are the determinants of performance for Swedish monetary financial institutions? Hypotheses were derived from the theoretical and empirical literature to address this question using seven explanatory variables. Two models were specified, using return on assets (ROA) and return on equity (ROE) as the main performance indicators, and for the sake of reliability and validity three estimators were employed: ordinary least squares (OLS), generalized least squares (GLS) and feasible generalized least squares (FGLS). The Akaike Information Criterion (AIC) was used to determine which specification explains performance better, and the robustness of the parameter estimates was checked by correcting the standard errors. The ROA specification had a lower AIC and lower standard errors than the ROE specification. Under ROA, two variables, the profit margin and the interest coverage ratio (ICR), proved statistically significant, while under ROE only the ICR was significant across all estimators. The results also show that FGLS is the most efficient estimator, followed by GLS and then OLS. When robust standard errors were applied, the gearing ratio, which measures capital structure, became significant under ROA, and its estimate became positive under ROE. The conclusion is that, within the period of study, three variables (ICR, profit margin and gearing) were significant and four were insignificant. The overall findings show that the institutions strive to maximize returns, but these returns were just sufficient to cover their costs of operation. More should be done, as the ASC theory suggests, to avoid liquidity and credit risk problems. The estimated values of the ICR and profit margin also show that considerable effort and sound financial policies are required to increase performance by one percentage point. Further research could examine how individual stochastic factors, such as the DuPont components, repo rates, inflation and GDP, influence performance.
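The model-comparison step can be sketched in a few lines; the sketch below fits only the OLS case and computes the Gaussian-likelihood form of the AIC, with invented regressors standing in for the study's variables (GLS and FGLS are omitted for brevity).

```python
import numpy as np

def ols_aic(y, X):
    """Fit OLS by least squares and return (coefficients, AIC),
    with AIC = n*ln(RSS/n) + 2k (Gaussian-likelihood form)."""
    X = np.column_stack([np.ones(len(y)), X])  # add intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, k = X.shape
    rss = resid @ resid
    return beta, n * np.log(rss / n) + 2 * k

# Invented data: ROA driven by a profit-margin and an ICR stand-in.
rng = np.random.default_rng(1)
n = 120
profit_margin = rng.normal(size=n)
icr = rng.normal(size=n)
roa = 0.5 * profit_margin + 0.3 * icr + rng.normal(scale=0.5, size=n)
beta, aic_roa = ols_aic(roa, np.column_stack([profit_margin, icr]))
```

Fitting two specifications and keeping the one with the lower AIC is exactly the comparison the abstract describes between the ROA and ROE models.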

Relevance:

30.00%

Publisher:

Abstract:

The International Fitness Scale (IFIS) is a self-reported measure of physical fitness that can be administered easily. This scale has been validated in children, adolescents, and young adults; however, it is unknown whether the IFIS provides a valid and reliable estimate of physical fitness in Latin American youth populations. In the present study we aimed to examine the validity and reliability of the IFIS in a population-based sample of schoolchildren in Bogotá, Colombia. Participants were 1,875 Colombian youths (56.2% girls) aged 9 to 17.9 years. We measured adiposity markers (body fat, waist-to-height ratio, skinfold thicknesses and BMI), blood pressure, lipid profile, fasting glucose, and physical fitness level (self-reported and measured). A validated cardiometabolic risk index was also used. An age- and sex-matched sample of 229 schoolchildren not included in the main study sample completed the IFIS twice for reliability purposes. Our data suggest that both measured and self-reported overall fitness were inversely associated with adiposity indicators and with a cardiometabolic risk score. Overall, schoolchildren who self-reported "good" or "very good" fitness had better measured fitness than those who reported "very poor" or "poor" fitness (all p<0.001). Test-retest reliability of the IFIS items was also good, with an average weighted kappa of 0.811. Our findings therefore suggest that self-reported fitness, as assessed by the IFIS, is a valid, reliable, and health-related measure, and can be a good alternative for future use in large studies of Latin American schoolchildren such as those in Colombia.
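The test-retest statistic reported above, a linearly weighted kappa, can be computed by hand from the two sets of ordinal ratings; the ratings below are invented, with the five IFIS response categories coded 0 to 4.

```python
import numpy as np

def weighted_kappa(a, b, n_cat):
    """Linearly weighted Cohen's kappa for two ordinal rating vectors."""
    a, b = np.asarray(a), np.asarray(b)
    obs = np.zeros((n_cat, n_cat))
    for i, j in zip(a, b):          # observed agreement matrix
        obs[i, j] += 1
    obs /= obs.sum()
    marg_a, marg_b = obs.sum(axis=1), obs.sum(axis=0)
    exp = np.outer(marg_a, marg_b)  # chance-agreement matrix
    idx = np.arange(n_cat)
    w = np.abs(idx[:, None] - idx[None, :]) / (n_cat - 1)  # linear weights
    return 1 - (w * obs).sum() / (w * exp).sum()

test1 = [0, 1, 2, 3, 4, 2, 3, 1, 4, 0]
retest = [0, 1, 2, 3, 4, 2, 3, 2, 4, 0]  # one rating shifted by one category
kappa = weighted_kappa(test1, retest, n_cat=5)
```

Linear weights penalize a one-category shift far less than a "very poor" vs "very good" disagreement, which is why weighted kappa suits ordinal self-report items.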

Relevance:

30.00%

Publisher:

Abstract:

Survival models are widely applied in engineering to model time-to-event data, where censoring is a common issue. With heterogeneous data, parametric models may not always provide a good fit. The present study relies on survival data for critical pumps, where traditional parametric regression can be improved to obtain better approximations. Accounting for censoring, we used an empirical method to split the data into two subgroups, fitted separate models to each, and combined two distinct distributions following a mixture-model approach. We conclude that this is a good method for fitting data that do not follow a usual parametric distribution, and that it yields reliable parameter estimates. A constant cumulative hazard rate policy was also used to determine optimum inspection times from the fitted mixture model, which can be compared with the current maintenance policies to decide whether changes should be introduced.
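The split-then-mix idea can be sketched numerically. The sketch below uses exponential components, whose censored maximum-likelihood estimate is simply events divided by total exposure, with invented pump data already split into two subgroups; the thesis itself uses an empirical split and richer parametric families.

```python
import numpy as np

def exp_rate(times, observed):
    """Censored MLE of an exponential hazard: events / total exposure.
    observed[i] = 0 marks a right-censored record."""
    times = np.asarray(times, float)
    observed = np.asarray(observed, bool)
    return observed.sum() / times.sum()

# Hypothetical pump lifetimes (hours) in two empirically split subgroups.
t1, d1 = [5, 8, 12, 20, 30], [1, 1, 0, 1, 0]
t2, d2 = [100, 150, 200, 250], [1, 0, 1, 1]

lam1, lam2 = exp_rate(t1, d1), exp_rate(t2, d2)
pi = len(t1) / (len(t1) + len(t2))  # mixing weight from group sizes

def mixture_survival(t):
    """Two-component mixture survival S(t) = pi*S1(t) + (1-pi)*S2(t)."""
    return pi * np.exp(-lam1 * t) + (1 - pi) * np.exp(-lam2 * t)
```

The mixture survival curve captures the fast-failing and slow-failing subpopulations that a single exponential (or any single unimodal parametric family) would average away.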

Relevance:

30.00%

Publisher:

Abstract:

The aim of the present study is to provide validation data for the Portuguese version of the Suicidal Behaviors Questionnaire-Revised (SBQ-R) in nonclinical individuals. Two studies were undertaken with two different nonclinical samples in order to demonstrate reliability and concurrent, predictive, and construct validity, and to establish an appropriate cut-off score for nonclinical individuals. A sample of 810 community adults participated in Study 1. Results from this study provided information on internal consistency, exploratory and confirmatory factor analysis, and concurrent validity. Receiver operating characteristic (ROC) curve analysis established a cut-off score to be used for screening purposes with nonclinical individuals. A sample of 440 young adults participated in Study 2, which demonstrated scale score internal consistency and 5-month predictive validity. Five-month test-retest reliability was also evaluated, and correlations of SBQ-R scores with two other measures assessing constructs related to suicidality, depression and psychache, were computed. In addition, confirmatory factor analysis was undertaken to demonstrate the robustness of the result obtained in Study 1. Overall, the findings support the psychometric appropriateness of the Portuguese Suicidal Behaviors Questionnaire-Revised.
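One common way to pick a screening cut-off from a ROC analysis is the Youden index (J = sensitivity + specificity - 1), sketched below on invented scores; the study's actual criterion for choosing its cut-off is not stated in the abstract.

```python
import numpy as np

def youden_cutoff(scores, labels):
    """Return the score threshold maximizing J = sens + spec - 1."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    best_j, best_cut = -1.0, None
    for cut in np.unique(scores):
        pred = scores >= cut
        sens = (pred & labels).sum() / labels.sum()
        spec = (~pred & ~labels).sum() / (~labels).sum()
        if sens + spec - 1 > best_j:
            best_j, best_cut = sens + spec - 1, cut
    return best_cut, best_j

# Toy data: questionnaire totals with perfect separation at 6.
scores = [2, 3, 3, 4, 5, 6, 7, 8, 9, 10]
labels = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
cut, j = youden_cutoff(scores, labels)
```

With these toy data the method recovers the separating threshold of 6 with J = 1; on real questionnaire data J is below 1 and the chosen cut-off trades sensitivity against specificity.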

Relevance:

30.00%

Publisher:

Abstract:

Monitoring agricultural crops is a vital task for the general understanding of the spatio-temporal dynamics of land use. This paper presents an approach for enhancing current crop-monitoring capabilities on a regional scale, in order to allow analysis of the environmental and socio-economic drivers and impacts of agricultural land use. The work discusses the advantages and current limitations of using 250 m vegetation index (VI) data from the Moderate Resolution Imaging Spectroradiometer (MODIS) for this purpose, with emphasis on the difficulty of correctly analyzing pixels whose temporal responses are disturbed by sources of interference such as mixed or heterogeneous land cover. It is shown that the influence of noisy or disturbed pixels can be minimized, and a much more consistent and useful result attained, if individual agricultural fields are identified and each field's pixels are analyzed collectively. To this end, a method is proposed that uses image-segmentation techniques based on MODIS temporal information to identify portions of the study area that agree with actual agricultural field borders. The pixels of each segment are then analyzed jointly to estimate the reliability of the observed temporal signal and, consequently, the relevance of any land-use estimate derived from it. The proposed method was applied in the state of Mato Grosso, in mid-western Brazil, where extensive ground truth data were available. Experiments were carried out using several supervised classification algorithms and different subsets of land cover classes, in order to test the methodology comprehensively. Results show that the proposed method consistently improves classification results, not only in overall accuracy but also qualitatively, by allowing a better understanding of the detected land-use patterns. It thus provides a practical and straightforward procedure for enhancing crop-mapping capabilities using temporal series of moderate-resolution remote sensing data.
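The field-level step, analyzing each segment's pixels collectively rather than individually, can be sketched as a majority vote over per-pixel class labels; the segment ids and labels below are invented, and the paper's actual per-segment analysis is richer than a plain vote.

```python
import numpy as np

def segment_majority(pixel_labels, segment_ids):
    """Replace each pixel's class with the modal class of its segment."""
    pixel_labels = np.asarray(pixel_labels)
    segment_ids = np.asarray(segment_ids)
    out = pixel_labels.copy()
    for seg in np.unique(segment_ids):
        mask = segment_ids == seg
        vals, counts = np.unique(pixel_labels[mask], return_counts=True)
        out[mask] = vals[np.argmax(counts)]
    return out

# One 5-pixel field with a single noisy pixel, and one 3-pixel field.
labels = np.array([1, 1, 2, 1, 1, 3, 3, 3])
segments = np.array([0, 0, 0, 0, 0, 1, 1, 1])
cleaned = segment_majority(labels, segments)
```

The lone mixed pixel in the first field is overruled by its neighbors, which is exactly the noise-suppression effect the paper attributes to field-level analysis.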

Relevance:

30.00%

Publisher:

Abstract:

Collecting ground truth data is an important step to be accomplished before performing a supervised classification. However, its quality depends on human, financial and time resources. It is therefore important to apply a validation process to assess the reliability of the acquired data. In this study, agricultural information was collected in the Brazilian Amazonian state of Mato Grosso in order to map crop expansion based on MODIS EVI temporal profiles. The field work was carried out through interviews for the years 2005-2006 and 2006-2007. This work presents a methodology to validate the quality of the training data and to determine the optimal sample to be used for a given classifier. The technique is based on the detection of outlier pixels for each class, computing the Mahalanobis distance for each pixel: the higher the distance, the further the pixel is from the class centre. Preliminary observations based on the coefficient of variation confirmed the efficiency of the technique in detecting outliers. Various subsamples were then defined by applying different thresholds to exclude outlier pixels from the classification process. The classification results demonstrate the robustness of the Maximum Likelihood and Spectral Angle Mapper classifiers: both were insensitive to outlier exclusion. In contrast, the decision tree classifier gave better results when 7.5% of the pixels were deleted from the training data. The technique managed to detect outliers for all classes. In this study, few outliers were present in the training data, so the classification quality was not deeply affected by them.
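The outlier screen described above can be sketched directly: compute each training pixel's Mahalanobis distance to its class centre and drop the most distant fraction. The data below are synthetic Gaussian vectors, not MODIS EVI profiles, and the 7.5% threshold is taken from the decision-tree result in the text.

```python
import numpy as np

def mahalanobis_distances(X):
    """Mahalanobis distance of each row of X to the sample mean."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    # Quadratic form (x - mu)^T Sigma^-1 (x - mu) for every row at once.
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))

rng = np.random.default_rng(42)
pixels = rng.normal(size=(100, 4))  # 100 pixels, 4 temporal features
pixels[0] += 10.0                   # plant one obvious outlier
d = mahalanobis_distances(pixels)
threshold = np.percentile(d, 92.5)  # drop the top 7.5%, as in the text
kept = pixels[d <= threshold]
```

The planted outlier receives the largest distance and is excluded, while the bulk of the class survives the screen.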

Relevance:

30.00%

Publisher:

Abstract:

In recent years, there has been exponential growth in the use of virtual spaces, including dialogue systems, that handle personal information. The concept of personal privacy is discussed and controversial in the literature, whereas in the technological field it directly influences the perceived reliability of an information system (privacy 'as trust'). This work aims to protect the right to privacy over personal data (GDPR, 2018) and to avoid the loss of sensitive content by exploring the sensitive information detection (SID) task. It is grounded on the following research questions: (RQ1) What does sensitive data mean, and how can a domain of personal sensitive information be defined? (RQ2) How can a state-of-the-art model for SID be created? (RQ3) How should such a model be evaluated? For RQ1, we theoretically investigate the concept of privacy and the ontological state-of-the-art representation of personal information. The Data Privacy Vocabulary (DPV) is the taxonomic resource taken as the authoritative reference for defining the knowledge domain. For RQ2, we investigate two approaches to classifying sensitive data: the first, bottom-up, explores automatic learning methods based on transformer networks; the second, top-down, proposes logical-symbolic methods based on the construction of privaframe, a knowledge graph of compositional frames representing personal data categories. Both approaches are tested. For the evaluation (RQ3), we create SPeDaC, a sentence-level labeled resource that can be used as a benchmark or for training in the SID task, filling the gap left by the absence of a shared resource in this field. While the approach based on artificial neural networks confirms the validity of the direction adopted in the most recent studies on SID, the logical-symbolic approach emerges as the preferred way to classify fine-grained personal data categories, thanks to the semantically grounded, tailored modeling it allows. At the same time, the results highlight the strong potential of hybrid architectures for solving automatic tasks.

Relevance:

30.00%

Publisher:

Abstract:

Artificial intelligence (AI) and machine learning (ML) are data analysis techniques that provide very accurate prediction results. They are widely adopted in a variety of industries to improve efficiency and decision-making, and they are also used to develop intelligent systems. Their success rests on complex mathematical models whose decisions and rationale are usually difficult for human users to comprehend, to the point of being dubbed black boxes. This is particularly relevant in sensitive and highly regulated domains. To mitigate and possibly solve this issue, the field of explainable AI (XAI) has become prominent in recent years. XAI comprises models and techniques that enable understanding of the intricate patterns discovered by black-box models. In this thesis, we consider model-agnostic XAI techniques applicable to tabular data, with a particular focus on the credit scoring domain. Special attention is dedicated to the LIME framework, for which we propose several modifications to the vanilla algorithm, in particular a pair of complementary stability indices that accurately measure LIME's stability, and the OptiLIME policy, which helps the practitioner find the proper balance between the stability and reliability of explanations. We then put forward GLEAMS, a model-agnostic surrogate interpretable model that needs to be trained only once while providing both local and global explanations of the black-box model. GLEAMS produces feature attributions and what-if scenarios from both the dataset and the model perspective. Finally, we note that synthetic data are an emerging trend in AI, increasingly used to train complex models in place of the original data. To explain the outcomes of such models, we must guarantee that the synthetic data are reliable enough for their explanations to transfer to real-world individuals. To this end, we propose DAISYnt, a suite of tests to measure the quality and privacy of synthetic tabular data.
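Why explanation stability is worth measuring for a sampling-based explainer like LIME can be illustrated generically: run the explainer several times on the same instance and compare the top-k feature sets. The sketch below uses a stand-in "explainer" with invented feature names and a plain mean pairwise Jaccard similarity; the stability indices actually proposed in the thesis are different, more refined measures.

```python
import random

def top_k_features(seed, k=3):
    """Stand-in for one LIME run: perturbs fixed importances with noise
    and returns the set of the k top-ranked (hypothetical) features."""
    rng = random.Random(seed)
    features = ['income', 'age', 'debt', 'tenure', 'balance']
    base = [5.0, 4.0, 3.0, 1.0, 0.5]
    noisy = [w + rng.gauss(0, 0.5) for w in base]
    ranked = sorted(zip(noisy, features), reverse=True)
    return {f for _, f in ranked[:k]}

def jaccard_stability(runs):
    """Mean pairwise Jaccard similarity of the top-k feature sets."""
    sims = [len(a & b) / len(a | b)
            for i, a in enumerate(runs) for b in runs[i + 1:]]
    return sum(sims) / len(sims)

runs = [top_k_features(seed) for seed in range(10)]
stability = jaccard_stability(runs)
```

A stability value near 1 means repeated explanations of the same instance agree on which features matter; values well below 1 are the symptom the thesis's indices are designed to detect.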

Relevance:

30.00%

Publisher:

Abstract:

Hadrontherapy is a medical treatment based on beams of charged particles accelerated towards deep-seated tumors in clinical patients. Its increasing use is due to the favorable depth-dose profile following the Bragg peak distribution, in which the dose release is sharply focused near the end of the beam path. However, nuclear interactions between the beam and the constituents of the human body occur, generating nuclear fragments that modify the dose profile. To overcome the lack of experimental data on nuclear fragmentation reactions in the energy range of interest to hadrontherapy, the FOOT (FragmentatiOn Of Target) experiment was conceived, with the main aim of measuring differential nuclear fragmentation cross sections with an uncertainty below 5%. The same results are of great interest in the radioprotection field, which studies similar processes: long-term human missions beyond Earth orbit are planned in the coming years, among them NASA's foreseen travel to Mars, and it is fundamental to protect astronauts' health and electronics from radiation exposure. In this thesis, a first analysis of the data taken at GSI with a beam of $^{16}O$ at 400 $MeV/u$ impinging on a graphite ($C$) target is presented, showing the first preliminary results for the elemental cross section and the angular differential cross section. A Monte Carlo dataset was first studied to test the performance of the track reconstruction algorithm and to check the reliability of the full analysis chain, from hit reconstruction to cross-section measurement. High agreement was found between generated and reconstructed fragments, validating the adopted procedure. A preliminary experimental cross section was measured and compared with the MC results, showing good consistency for all fragments.
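For orientation, the basic thin-target relation behind such a measurement is sigma = N_frag / (N_beam * n_t), where n_t is the areal density of target nuclei. The worked numbers below (beam and fragment counts, target thickness) are invented for illustration, not FOOT data; only the physical constants and the approximate graphite density are real.

```python
# Hypothetical worked example of the thin-target cross-section formula.
N_A = 6.022e23   # Avogadro's number [1/mol]
rho = 1.83       # graphite density [g/cm^3] (approximate)
thickness = 0.5  # target thickness [cm] (hypothetical)
A = 12.011       # molar mass of carbon [g/mol]

n_t = rho * thickness * N_A / A  # target nuclei per cm^2

N_beam = 1e9     # primaries on target (hypothetical)
N_frag = 4.6e6   # fragments of one species detected (hypothetical)

sigma_cm2 = N_frag / (N_beam * n_t)  # cross section [cm^2]
sigma_mb = sigma_cm2 / 1e-27         # 1 mb = 1e-27 cm^2
```

With these toy inputs the result lands near 100 mb, the right order of magnitude for fragmentation channels, though the real analysis must also correct for efficiency, acceptance and multiple interactions.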

Relevance:

20.00%

Publisher:

Abstract:

High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives to the systems biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variation, the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and produce a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) the Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) the Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and to add them to the project; (iii) the Annotation module, which assigns annotations from several databases to the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) the Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather newly identified interactions, protein and metabolite expression/concentration levels, subcellular localization, computed topological metrics, and GO biological process and KEGG pathway enrichment. This module generates an XGMML file that can be imported into Cytoscape or visualized directly on the web. We developed IIS by integrating diverse databases, in response to the need for appropriate tools for the systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two-hybrid, proteomics and metabolomics datasets, but it is also extendable to other datasets. IIS is freely available online at: http://www.lge.ibi.unicamp.br/lnbio/IIS/.

Relevance:

20.00%

Publisher:

Abstract:

The article investigates patterns of performance in, and relationships between, grip strength, gait speed and self-rated health, considering the variables of gender, age and family income. The study was conducted on a probabilistic sample of community-dwelling elderly people aged 65 and over, members of a population study on frailty. A total of 689 elderly people without cognitive deficit suggestive of dementia underwent tests of gait speed and grip strength. Comparisons between groups were based on low, medium and high speed and strength. Self-rated health was assessed using a 5-point scale. Males and the younger elderly scored significantly higher on grip strength and gait speed than females and the oldest participants; the richest scored higher than the poorest on both measures; women and men aged over 80 had weaker grip strength and lower gait speed; slow gait speed and low income emerged as risk factors for a worse health evaluation. Lower muscular strength affects self-rated health because it reduces functional capacity, especially in the presence of poverty and a lack of compensatory factors.