948 results for Data reliability


Relevance:

100.00%

Publisher:

Abstract:

Awareness of optimal behaviour states of children with profound intellectual disability has been reported in the literature as a potentially useful tool for planning intervention within this population. Some arguments have been raised, however, which question the reliability and validity of previously published work on behaviour state analysis. This article sheds light on the debate by presenting two stages of a study of behaviour state analysis for eight girls with Rett syndrome. The results support Mudford, Hogg, and Roberts' (1997, 1999) concerns with the pooling of participant data. The results of Stage 2 also suggest, however, that most categories of behaviour state can be reliably distinguished once definitions of behaviours for each state are clearly defined.

Relevance:

70.00%

Publisher:

Abstract:

Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in the quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, no single method resolves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity, and pressure fields and calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing clutter-removal methods by modifying the parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations of beam propagation in a non-uniformly stratified atmosphere, but although intrinsically very efficient, it is not fast enough for near-real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for improving the model profiles in the lowest levels of the troposphere.
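The refractive-index structure mentioned above is conventionally summarised by the radio refractivity N, computed from pressure, temperature, and humidity; anaprop is associated with layers where the modified refractivity M decreases with height (ducting). A minimal sketch of the standard two-term formula follows; this is not the authors' NWP-driven implementation, and the two-level profile values are purely illustrative:

```python
def refractivity(p_hpa, t_kelvin, e_hpa):
    """Radio refractivity N (N-units) from pressure, temperature, and
    water-vapour partial pressure, using the standard two-term formula."""
    return 77.6 * p_hpa / t_kelvin + 3.73e5 * e_hpa / t_kelvin ** 2

def modified_refractivity(n_units, height_m):
    """Modified refractivity M; ducting occurs where M decreases with height."""
    return n_units + 0.157 * height_m

# Illustrative two-level profile: (height m, pressure hPa, temperature K, vapour hPa).
# A temperature inversion with a sharp humidity drop, typical of anaprop conditions.
levels = [(0.0, 1013.0, 290.0, 14.0), (100.0, 1001.0, 291.5, 6.0)]
m_profile = [modified_refractivity(refractivity(p, t, e), h) for h, p, t, e in levels]
ducting = m_profile[1] < m_profile[0]  # M decreases with height in this profile
```

In an NWP-driven system such as the one described, the same computation would be repeated on the model's full three-dimensional grid to map where clutter is physically plausible.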

Relevance:

70.00%

Publisher:

Abstract:

Background. Through a national policy agreement, over 167 million euros will be invested in the Swedish National Quality Registries (NQRs) between 2012 and 2016. One of the policy agreement's intentions is to increase the use of NQR data for quality improvement (QI). However, the evidence is fragmented as to how the use of medical registries and the like leads to quality improvement, and little is known about non-clinical use. The aim was therefore to investigate the perspectives of Swedish politicians and administrators on quality improvement based on national registry data. Methods. Politicians and administrators from four county councils were interviewed. A qualitative content analysis guided by the Consolidated Framework for Implementation Research (CFIR) was performed. Results. The politicians' and administrators' perspectives on the use of NQR data for quality improvement were mainly assigned to three of the five CFIR domains. In the domain of intervention characteristics, data reliability and access within a reasonable time were not considered entirely satisfactory, making it difficult for the politico-administrative leaderships to initiate, monitor, and support timely QI efforts. Still, politicians and administrators trusted the idea of using the NQRs as a basis for quality improvement. In the domain of inner setting, the organizational structures were not sufficiently developed to utilize the advantages of the NQRs, and readiness for implementation appeared to be inadequate for two reasons. Firstly, the resources for data analysis and quality improvement were not considered sufficient at the politico-administrative or clinical level. Secondly, deficiencies in leadership engagement at multiple levels were described, and there was a lack of consensus on the politicians' role and level of involvement. Regarding the domain of outer setting, there was a lack of communication and cooperation between the county councils and the national NQR organizations. Conclusions. The Swedish experiences show that a government-supported national system of well-funded, well-managed, and reputable national quality registries needs favorable local politico-administrative conditions to be used for quality improvement; such conditions are not yet in place according to local politicians and administrators.

Relevance:

70.00%

Publisher:

Abstract:

In deregulated power markets it is necessary to have an appropriate transmission pricing methodology that also takes congestion and reliability into account, in order to ensure economically viable, equitable, and congestion-free power transfer capability with high reliability and security. This thesis presents the results of research on the development of a Decision Making Framework (DMF) of concepts, data-analytic methods, and modelling methods for the reliability-benefit-reflective optimal evaluation of transmission cost for composite power systems, using probabilistic methods. The methodology within the DMF, devised and reported in this thesis, uses a full AC Newton-Raphson load flow and a Monte Carlo approach to determine reliability indices, which are then used in the proposed Meta-Analytical Probabilistic Approach (MAPA) for the evaluation and calculation of the Reliability-benefit Reflective Optimal Transmission Cost (ROTC) of a transmission system. The DMF includes methods for allocating transmission-line embedded costs among transmission transactions, accounting for line capacity use, as well as congestion costing that can be used for pricing, applying the Power Transfer Distribution Factor (PTDF) and Bialek's method within a series of methods and procedures, explained in detail in the thesis, for the proposed MAPA for the ROTC. The MAPA uses bus, generator, line, reliability, and Customer Damage Function (CDF) data for congestion, transmission, and reliability costing studies; the proposed PTDF application and other established methods are compared, analysed, and selected according to area/state requirements, and then integrated to develop the ROTC.
Case studies involving standard 7-bus, IEEE 30-bus, and 146-bus Indian utility test systems are conducted and reported in the relevant sections of the dissertation. Close correlation is found between the results obtained with the proposed PTDF application and those of Bialek's and various MW-Mile methods. The novel contributions of this research are, firstly, the application of the PTDF method to the determination of transmission and congestion costing, compared with other proven methods, with the viability of the developed method discussed in the methodology, discussion, and conclusion chapters; secondly, the development of a comprehensive DMF that helps decision makers analyse and select costing approaches according to their requirements, since all the costing approaches in the DMF have been integrated to achieve the ROTC; and thirdly, the formulation of the composite ROTC methodology into suites of algorithms and MATLAB programs for each part of the DMF, described further in the methodology section. The dissertation concludes with suggestions for future work.
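The PTDF mentioned above gives the fraction of a power injection at a bus that flows over a given line; under the usual DC power-flow approximation it follows from the inverse of the reduced nodal susceptance matrix. A minimal sketch on a hypothetical symmetric three-bus network; the network data and hand-coded 2x2 inverse are illustrative, not the thesis test systems:

```python
# Hypothetical 3-bus network; bus 0 is the slack. Lines: (from, to, reactance p.u.)
lines = [(0, 1, 0.1), (0, 2, 0.1), (1, 2, 0.1)]
n_bus = 3

# Build the reduced susceptance matrix B' (slack bus row/column removed).
B = [[0.0] * (n_bus - 1) for _ in range(n_bus - 1)]
for f, t, x in lines:
    b = 1.0 / x
    for a in (f, t):
        if a > 0:
            B[a - 1][a - 1] += b
    if f > 0 and t > 0:
        B[f - 1][t - 1] -= b
        B[t - 1][f - 1] -= b

# Invert the 2x2 B' by hand (a general linear solver would be used in practice).
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
X = [[B[1][1] / det, -B[0][1] / det],
     [-B[1][0] / det, B[0][0] / det]]

def ptdf(line_idx, bus):
    """Change in p.u. flow on line `line_idx` per 1 p.u. injected at `bus`
    and withdrawn at the slack bus (DC approximation)."""
    f, t, x = lines[line_idx]
    xf = X[f - 1][bus - 1] if f > 0 else 0.0
    xt = X[t - 1][bus - 1] if t > 0 else 0.0
    return (xf - xt) / x

# By symmetry, one third of an injection at bus 1 flows over the line from bus 1 to bus 2.
share_over_line_12 = ptdf(2, 1)
```

Once such factors are available for every line, congestion cost can be attributed to transactions in proportion to their contribution to each line's loading, which is how PTDF-based costing schemes like the one in the thesis are typically organised.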

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE: To assess the quality of hospital admission data for external causes in São José dos Campos, São Paulo, Brazil. METHODS: Admissions through the Unified Health System (SUS) for injuries from external causes in the first half of 2003 at the Municipal Hospital, the trauma referral centre of the municipality, were studied by comparing the data recorded in the Hospital Information System with the medical records of 990 admissions. Agreement of the variables concerning the victim, the admission, and the injury was evaluated using the crude agreement rate and the Kappa coefficient. Injuries and external causes were coded according to chapters XIX and XX, respectively, of the 10th revision of the International Classification of Diseases. RESULTS: The crude agreement rate was good for variables concerning the victim and the admission, ranging from 89.0% to 99.2%. Agreement for injuries was excellent, except for neck injuries (k=0.73), multiple injuries (k=0.67), and chest fractures (k=0.49). External causes showed excellent agreement for transport accidents (k=0.90) and falls (k=0.83). Reliability was lower for assaults (k=0.50), undetermined causes (k=0.37), and complications of medical care (k=0.03). There was excellent agreement for transport accidents involving pedestrians, cyclists, and motorcyclists. CONCLUSION: Most study variables showed good quality at the aggregation level analysed. Some variables concerning the victim and some types of external causes require improvement in data quality. The hospital morbidity profile found confirmed transport accidents as a major external cause of hospital admission in the municipality.

Relevance:

60.00%

Publisher:

Abstract:

In bottom-up proteomics, rapid and efficient protein digestion is crucial for data reliability, yet sample preparation remains one of the rate-limiting steps in proteomics workflows. In this study, we compared the conventional trypsin digestion procedure with two accelerated digestion protocols, based on shorter reaction times and on microwave-assisted digestion, for the preparation of membrane-enriched protein fractions of the human pathogenic bacterium Staphylococcus aureus. The peptides produced were analyzed by Shotgun IPG-IEF, a methodology relying on separation of peptides by IPG-IEF before the conventional LC-MS/MS steps of shotgun proteomics. Data obtained on two LC-MS/MS platforms showed that the accelerated digestion protocols, especially the one relying on microwave irradiation, enhanced the cleavage specificity of trypsin and thus improved digestion efficiency, especially for hydrophobic and membrane proteins. The combination of high-throughput proteomics with accelerated and efficient sample preparation should enhance the practicability of proteomics by reducing the time from sample collection to results.

Relevance:

60.00%

Publisher:

Abstract:

It is axiomatic that our planet is extensively inhabited by diverse micro-organisms such as bacteria, yet the absolute diversity of bacterial species is widely held to be unknown. Different bacteria can be found from the depths of the oceans to the tops of the mountains; even the air is more or less colonized by bacteria. Most bacteria are either harmless or even beneficial to human beings, but there are also bacteria that can cause severe infectious diseases or spoil supplies intended for human consumption. It is therefore vitally important not only to detect and enumerate bacteria but also to assess their viability and possible harmfulness. Whilst the growth of bacteria is remarkably fast under optimum conditions and easy to detect by cultural methods, most bacteria are believed to lie in the stationary phase of growth, in which actual growth has ceased and bacteria may simply be undetectable by cultural techniques. Additionally, several injurious factors such as low or high temperature or nutrient deficiency can turn bacteria into a viable but non-culturable (VBNC) state that cannot be detected by cultural methods. Consequently, various non-cultural techniques developed for the assessment of bacterial viability and killing have been widely exploited in modern microbiology. However, only a few methods are suitable for kinetic measurements, which enable the real-time detection of bacterial growth and viability. The present study describes alternative methods for measuring bacterial viability and killing, as well as for detecting the effects of various antimicrobial agents on bacteria, in real time. The suitability of bacterial (lux) and beetle (luc) luciferases, as well as green fluorescent protein (GFP), to act as markers of bacterial viability and cell growth was tested.
In particular, a multiparameter microplate assay based on a GFP-luciferase combination and a flow cytometric measurement based on a GFP-PI combination were developed to perform divergent viability analyses. The results suggest that the antimicrobial activities of various drugs against bacteria can be successfully measured using both of these methods. Specifically, the data reliability of the flow cytometric viability analysis was notably improved when GFP was utilized in the assay. A fluoro-luminometric microplate assay enabled kinetic measurements, which significantly improved and accelerated the assessment of bacterial viability compared to more conventional viability assays such as plate counting. Moreover, the multiparameter assay made simultaneous detection of GFP fluorescence and luciferase bioluminescence possible and provided extensive information about multiple cellular parameters in a single assay, thereby increasing the accuracy of the assessment of the kinetics of antimicrobial activities on target bacteria.

Relevance:

60.00%

Publisher:

Abstract:

This work aims to determine a better methodology for predicting operational parameters of a new mixer-settler design for treating wastewater produced by the petroleum industry, called MDIF (Misturador-Decantador à Inversão de Fases / Mixer-Settler based on Phase Inversion, MSPI). The data were obtained from the wastewater treatment unit MSPI-TU, installed at a wastewater treatment plant (WTP) of PETROBRAS/UO-RNCE. Determining the best methodology to predict the separation and extraction efficiency of the equipment contributes significantly to determining the optimum operating variables for the control of the unit. The study was based on a comparison among the experimental efficiency (E) obtained by operating the MSPI-TU, the efficiency obtained from the experimental design equation (Eplan) of the software Statistica Experimental Design® (version 7.0), and the efficiency obtained from a modelling equation based on dimensional analysis (Ecalc). The results show that the experimental design equation gives a good prediction of unit efficiencies with better data reliability with respect to conditions before a run. The average deviation between the statistical planning model equation and the experimental data was 0.13%. On the other hand, the efficiency calculated by the dimensional analysis equation may show large relative deviations (up to 70%). Thus, experimental design is confirmed as a reliable tool for processing the experimental data of the MSPI-TU.
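The 0.13% average deviation quoted above is a mean relative deviation between model-predicted and experimental efficiencies. A minimal sketch of that comparison follows; the efficiency values are illustrative, not MSPI-TU data:

```python
def mean_relative_deviation(predicted, measured):
    """Mean absolute relative deviation between two series, in percent."""
    deviations = [abs(p - m) / m * 100.0 for p, m in zip(predicted, measured)]
    return sum(deviations) / len(deviations)

# Illustrative separation efficiencies (%): model prediction vs. experiment.
e_plan = [94.8, 96.1, 95.3]
e_exp = [95.0, 96.0, 95.5]
deviation_pct = mean_relative_deviation(e_plan, e_exp)  # a fraction of one percent
```

The same metric applied to a poorly fitting model (such as the dimensional-analysis equation reported above) would yield deviations of tens of percent rather than a fraction of one percent.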

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

The aim of this study was to identify the moment of greatest data reliability in the language assessment process for establishing a child's pragmatic profile. Five children with typical language development, aged between 7 years 1 month and 8 years 11 months, participated. A total of 150 minutes of recording was made in a situation of interaction between each child and the researcher, divided into five individual 30-minute sessions. The data were subsequently analysed according to the verbal communicative abilities protocol (HCV), outlining the individual pragmatic profile of each recording (30 minutes) and of the whole sample (150 minutes) in order to compare (sessions 1 to 5 versus the overall total) the reliability indices (IF) and confidence status (SC). For the calculation of the IF and SC, inter-observer and intra-observer analyses, respectively, were performed. Children 1 and 2 reached the highest IF in session 2; child 3 showed similar IF values in sessions 3, 4, and 5; child 4 obtained the highest IF in sessions 1 and 3; and child 5 reached the same IF value in all sessions. Regarding the SC, session 2 showed the highest percentage of very high confidence for most of the children, followed by session 3. In the analysis by HCV category, session 3 showed the highest SC for dialogic abilities, narrative-discursive abilities, and the overall HCV total. Overall, sessions 2 and 3 were those that yielded the highest IF and SC in the analysis performed to outline the child's pragmatic profile.

Relevance:

60.00%

Publisher:

Abstract:

In this report a new automated optical test for the next generation of photonic integrated circuits (PICs) is presented through the design and assessment of a test bed. After a brief analysis of the critical problems of current optical tests, the main test features are defined: automation and flexibility, a relaxed alignment procedure, speed-up of the entire test, and data reliability. After studying various solutions, the test-bed components are defined as a lens array, a photo-detector array, and a software controller. Each device is studied and calibrated, and the spatial resolution and robustness against interference at the photo-detector array are examined. The software is programmed to manage both the PIC input and the photo-detector array output, as well as the data analysis. The test is validated by analysing a state-of-the-art 16-port PIC: the waveguide location, current versus power, and time-spatial power distribution are measured, as well as the optical continuity of an entire path of the PIC. Complexity, alignment tolerance, and measurement time are also discussed.

Relevance:

60.00%

Publisher:

Abstract:

Hydrographers have traditionally referred to the nearshore area as the "white ribbon" because of the challenges associated with collecting elevation data in this highly dynamic transitional zone between terrestrial and marine environments. Accordingly, available information in this zone is typically characterised by a range of datasets from disparate sources. In this paper we propose a framework to 'fill' the white ribbon of a coral reef system by integrating multiple elevation and bathymetric datasets, acquired by a suite of remote-sensing technologies, into a seamless digital elevation model (DEM). The integrated datasets include field-collected GPS elevation points, terrestrial and bathymetric LiDAR, single-beam and multibeam bathymetry, nautical chart depths, and empirically derived bathymetry estimates from optical remote-sensing imagery. The proposed framework ranks data reliability internally, thereby avoiding the requirement to quantify absolute error, and results in a high-resolution, seamless product. Nested within this approach is an effective, spatially explicit technique for improving the accuracy of bathymetry estimates derived empirically from optical satellite imagery by modelling the spatial structure of the residuals. The approach was applied to data collected on and around Lizard Island in northern Australia. Collectively, the framework holds promise for filling the white ribbon zone in coastal areas with similar data availability. The seamless DEM is referenced to the MGA Zone 55 (GDA 1994) horizontal coordinate system and the mean sea level (MSL) vertical datum, and has a spatial resolution of 20 m.
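The residual-modelling step described above can be illustrated with a simple spatial interpolator: residuals between empirically derived depths and trusted reference depths at check points are interpolated and subtracted from the empirical estimate. Inverse-distance weighting is used here as a simpler stand-in for the paper's spatial-structure modelling, and the check-point values are hypothetical:

```python
import math

def idw_residual(x, y, samples, power=2.0):
    """Inverse-distance-weighted residual at (x, y) from (xs, ys, residual) samples."""
    num = den = 0.0
    for xs, ys, r in samples:
        d = math.hypot(x - xs, y - ys)
        if d == 0.0:
            return r  # exactly on a sample point
        w = d ** -power
        num += w * r
        den += w
    return num / den

# Hypothetical check points: empirical depth minus reference depth (metres).
residuals = [(0.0, 0.0, 0.5), (100.0, 0.0, -0.3)]

# Correct an empirical depth estimate of 12.0 m midway between the check points:
# the interpolated residual there is the mean of the two, 0.1 m.
corrected_depth = 12.0 - idw_residual(50.0, 0.0, residuals)
```

Because the correction surface honours the spatial structure of the errors, nearby satellite-derived depths are adjusted coherently rather than by a single global offset.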