159 results for "Data reliability"


Relevance: 20.00%

Abstract:

Mortality statistics are used in epidemiology and public health as an indicator of health status, in the evaluation of health programs, and in population studies aimed at comparing temporal trends and geographic differences. One of the variables used in this type of analysis is the underlying cause of death. However, the quality of statistics based on the causes of death declared by physicians on death certificates has been criticized. The objective of this article is to reflect on the reliability of the causes of death declared by physicians on death certificates, drawing on studies conducted with different methodologies, and to comment on the validity of cause-of-death mortality statistics.

Relevance: 20.00%

Abstract:

OBJECTIVE: To assess the reliability of the nutritional information declared on the labels of commercially available foods. METHODS: A total of 153 industrialized foods habitually consumed by children and adolescents, sold in the city of São Paulo (SP) between 2001 and 2005, were evaluated. The nutrient contents stated on the labels were compared against the results obtained by official analytical (physico-chemical) methods, applying the 20% variability tolerated by current legislation to approve or condemn the samples. Means, standard deviations, and 95% confidence intervals were calculated for the nutrients analyzed, as well as the percentage frequency distribution of condemned samples. RESULTS: All of the savory products analyzed showed nonconformities in dietary fiber, sodium, or saturated fat content. Condemnation rates for the sweet products ranged from zero to 36% with respect to dietary fiber content. More than half (52%) of the filled cookies were condemned for their saturated fat content. The nutrients implicated in obesity and its health complications were those with the highest proportions of nonconformity. The lack of reliability of the label information in the samples analyzed violates Resolution RDC 360/03 of ANVISA, as well as the rights guaranteed by the Food and Nutrition Security law and by the Consumer Protection Code. CONCLUSIONS: High rates of nonconformity were found in the nutritional information on labels of foods aimed at children and adolescents, indicating the urgency of enforcement actions and of other nutritional labeling measures.
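The approve-or-condemn rule in this abstract amounts to a tolerance check against the declared value. A minimal sketch, assuming a symmetric ±20% band around the label value (the helper name and the example figures are illustrative, not the study's data):

```python
def within_tolerance(declared: float, measured: float, tol: float = 0.20) -> bool:
    """Return True if the analytically measured nutrient content lies within
    the tolerated fractional deviation from the declared label value."""
    if declared == 0:
        return measured == 0
    return abs(measured - declared) / declared <= tol

# Example: a label declares 10 g of saturated fat per serving.
print(within_tolerance(10.0, 11.5))  # 15% deviation: within the 20% band
print(within_tolerance(10.0, 13.0))  # 30% deviation: sample condemned
```

A sample failing this check for any declared nutrient would count toward the condemnation frequencies reported above.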

Relevance: 20.00%

Abstract:

There is no doubt about the importance of service quality as a factor in business success, but measuring this quality has proved to be a challenge when one considers different environmental contexts. Given this, the main goal of this paper was to test two measurement scales of perceived service quality. The comparison between the Service Quality scale (Servqual) and the Retail Service Quality (RSQ) scale was conducted by means of a survey with 351 participants, clients of a home center store chain located in the city of São Paulo. The data were analyzed using both exploratory and confirmatory factor analysis. As a result, both scales demonstrated acceptable levels of reliability and validity. However, the RSQ scale demonstrated better performance in the nomological test, since it was able to explain 43% of loyalty towards the retailer, while the Servqual scale explained only 11%.

Relevance: 20.00%

Abstract:

The structural engineering community in Brazil faces new challenges with the recent occurrence of high-intensity tornados. Satellite surveillance data show that the area covering the south-east of Brazil, Uruguay, and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous tornado alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. In this paper, reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability and the corresponding limit state exceedance cost. These costs include the costs of repairing, rebuilding, and paying compensation for injury and loss of life. The total expected failure cost is the sum of the individual expected costs over all failure modes. The steel frame communications tower that is the subject of this study has become very common in Brazil due to increasing mobile phone coverage. The study shows that optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations.
Failure consequences are also different for the different parties involved in the design, construction, and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g., in residential or office buildings), it is not too costly to over-design; this observation is in agreement with the observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of design cost (e.g., a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads like tornados, the optimum reliability is strongly dependent on the selected design life.
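The trade-off described here (construction cost rising, expected failure cost falling, as the safety factor grows) can be sketched numerically. A minimal illustration of reliability-based risk optimization over a single safety factor; the cost and failure-probability models below are invented for illustration and are not the paper's actual data:

```python
import math

def failure_probability(safety_factor: float) -> float:
    # Assumed toy model: exceedance probability decays exponentially
    # with the partial safety factor.
    return math.exp(-3.0 * safety_factor)

def total_expected_cost(safety_factor: float,
                        construction_unit_cost: float = 100.0,
                        failure_cost: float = 1.0e6) -> float:
    construction = construction_unit_cost * safety_factor   # grows with over-design
    risk = failure_probability(safety_factor) * failure_cost  # shrinks with over-design
    return construction + risk

# Grid search for the safety factor minimizing total expected cost.
factors = [1.0 + 0.02 * i for i in range(200)]
best = min(factors, key=total_expected_cost)
```

Under these assumed numbers the optimum lands near 3.44; raising the assumed failure cost (worse consequences) pushes the optimum safety factor up, which is the qualitative point of the study.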

Relevance: 20.00%

Abstract:

In some circumstances ice floes may be modeled as beams. In general, this modeling assumes constant thickness, which contradicts field observations: the action of currents, wind, and the sequence of contacts causes thickness to vary. Here this effect is taken into consideration in modeling the behavior of ice hitting the inclined walls of offshore platforms. For this purpose, the boundary value problem is first formulated. The set of equations so obtained is then transformed into a system of equations, which is then solved numerically. To this end, an implicit solution is developed using a shooting method, with the accompanying Jacobian. In-plane coupling and the dependency of the boundary terms on deformation make the problem non-linear and the development particular. Deformation and internal resultants are then computed for harmonic forms of the beam profile. Ways of giving some additional generality to the problem are discussed.

Relevance: 20.00%

Abstract:

Nowadays, digital computer systems and networks are the main engineering tools, being used in the planning, design, operation, and control of buildings, transportation, machinery, business, and life-sustaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just like vaccines against diseases and are not able to prevent new infections based on the network state. Here, a trial at modeling computer virus propagation dynamics relates it to other notable events occurring in the network, permitting the establishment of preventive policies in network management. Data on three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
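The autoregressive identification mentioned above fits a linear recurrence to a measured series and propagates it forward. A minimal sketch with a first-order model on a synthetic decaying series (the paper's actual virus data are not reproduced; function names are illustrative):

```python
def fit_ar1(series):
    """Least-squares estimate of the coefficient a in x[t] ~ a * x[t-1]."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def forecast(x_last, a, steps):
    """Propagate the fitted one-step model forward from the last observation."""
    out = [x_last]
    for _ in range(steps):
        out.append(a * out[-1])
    return out

# Synthetic infection count decaying as x[t] = 0.8 * x[t-1].
data = [100 * 0.8 ** t for t in range(10)]
a = fit_ar1(data)                    # recovers a coefficient near 0.8
future = forecast(data[-1], a, 3)    # extrapolated dynamics
```

A coefficient fitted on one virus's decay could then be applied to the early observations of a new virus, which is the forecasting idea the abstract describes.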

Relevance: 20.00%

Abstract:

Diagnostic methods have been an important tool in regression analysis for detecting anomalies, such as departures from error assumptions and the presence of outliers and influential observations, in fitted models. Assuming censored data, we considered both a classical analysis and a Bayesian analysis with non-informative priors for the parameters of a model with a cure fraction. The Bayesian approach used Markov chain Monte Carlo methods with Metropolis-Hastings algorithm steps to obtain the posterior summaries of interest. Several influence measures, such as local influence, total local influence of an individual, local influence on predictions, and generalized leverage, were derived, analyzed, and discussed for survival data with a cure fraction and covariates. The relevance of the approach is illustrated with a real data set, where it is shown that, by removing the most influential observations, the decision about which model best fits the data is changed.
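The Metropolis-Hastings step referred to above proposes a move and accepts it with a probability driven by the target density ratio. A generic sketch with a random-walk proposal; the cure-fraction posterior of the paper is replaced here by a standard normal target purely for illustration:

```python
import random, math

def metropolis_hastings(log_target, n_samples, x0=0.0, step=1.0, seed=42):
    """Random-walk Metropolis-Hastings sampler for a 1-D log-density."""
    random.seed(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)      # symmetric proposal
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(random.random()) < log_alpha:   # accept/reject step
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, log density -x^2/2 (up to a constant).
samples = metropolis_hastings(lambda v: -0.5 * v * v, 20000)
mean = sum(samples) / len(samples)   # should sit near 0
```

In the paper's setting the log-target would be the posterior of the cure-fraction model given the censored data, and the chain's draws would feed the posterior summaries and influence diagnostics.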

Relevance: 20.00%

Abstract:

We consider a nontrivial one-species population dynamics model with finite and infinite carrying capacities. Time-dependent intrinsic and extrinsic growth rates are considered in these models. Through the model's per capita growth rate we obtain a heuristic general procedure to generate scaling functions that collapse data onto a simple linear behavior, even if an extrinsic growth rate is included. With this data collapse, all the models studied become independent of the parameters and the initial condition. Analytical solutions are found when time-dependent coefficients are considered. These solutions allow us to perceive nontrivial transitions between species extinction and survival and to calculate the transitions' critical exponents. Considering an extrinsic growth rate as a cancer treatment, we show that the relevant quantity depends not only on the intensity of the treatment, but also on when the cancerous cell growth is maximum.
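As a minimal illustration of the per-capita formulation the abstract alludes to (the paper's specific generalizations are not reproduced here), a Verhulst-type model with time-dependent intrinsic rate $r(t)$, carrying capacity $K$, and extrinsic rate $\epsilon(t)$ reads:

```latex
\frac{1}{N}\frac{\mathrm{d}N}{\mathrm{d}t}
  = r(t)\left(1 - \frac{N}{K}\right) - \epsilon(t)
```

For $\epsilon = 0$ the logistic solution gives $\ln\!\left(K/N(t) - 1\right) = \ln\!\left(K/N_0 - 1\right) - \int_0^t r(s)\,\mathrm{d}s$, i.e., trajectories with different parameters and initial conditions fall on a single straight line in the rescaled variables, which is the kind of linear data collapse the abstract describes.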

Relevance: 20.00%

Abstract:

In this study we analyzed the topography of induced cortical oscillations in 20 healthy individuals performing simple attention tasks. We were interested in qualitatively replicating our recent findings on the localization of attention-induced beta bands during a visual task [1], and in verifying whether significant topographic changes would follow a shift of attention to the auditory modality. We computed corrected latency averaging of each induced frequency band, and modeled their generators by current density reconstruction with Lp-norm minimization. We quantified topographic similarity between conditions by an analysis of correlations, whereas the inter-modality significant differences in attention correlates were illustrated in each individual case. We replicated the qualitative finding that the topography of attention-related activity is highly idiosyncratic to individuals, manifested both in the beta bands and in the previously studied slow potential distributions [2]. Visual inspection of both scalp potentials and the distribution of cortical currents showed minor changes in attention-related bands with respect to modality, as compared to the theta and delta bands, known to be major contributors to the sensory-related potentials. Quantitative results agreed with visual inspection, supporting the conclusion that attention-related activity does not change much between modalities, and that whatever individual changes do occur are not systematic in cortical localization across subjects. We discuss our results, combined with results from other studies that present individual data, with respect to the function of cortical association areas.

Relevance: 20.00%

Abstract:

Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even in clinical practice. Although communication has a central role in interdisciplinary collaboration, and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (epidemiologist, clinical epidemiologist, and data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal.
Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.

Relevance: 20.00%

Abstract:

Introduction: Work disability is a major consequence of rheumatoid arthritis (RA), associated not only with traditional disease activity variables, but also more significantly with demographic, functional, occupational, and societal variables. Recent reports suggest that the use of biologic agents offers potential for reduced work disability rates, but the conclusions are based on surrogate disease activity measures derived from studies primarily from Western countries. Methods: The Quantitative Standard Monitoring of Patients with RA (QUEST-RA) multinational database of 8,039 patients in 86 sites in 32 countries, 16 with high gross domestic product (GDP) (>24K US dollars (USD) per capita) and 16 low-GDP countries (<11K USD), was analyzed for work and disability status at onset and over the course of RA and clinical status of patients who continued working or had stopped working in high-GDP versus low-GDP countries according to all RA Core Data Set measures. Associations of work disability status with RA Core Data Set variables and indices were analyzed using descriptive statistics and regression analyses. Results: At the time of first symptoms, 86% of men (range 57%-100% among countries) and 64% (19%-87%) of women <65 years were working. More than one third (37%) of these patients reported subsequent work disability because of RA. Among 1,756 patients whose symptoms had begun during the 2000s, the probabilities of continuing to work were 80% (95% confidence interval (CI) 78%-82%) at 2 years and 68% (95% CI 65%-71%) at 5 years, with similar patterns in high-GDP and low-GDP countries. Patients who continued working versus stopped working had significantly better clinical status for all clinical status measures and patient self-report scores, with similar patterns in high-GDP and low-GDP countries. However, patients who had stopped working in high-GDP countries had better clinical status than patients who continued working in low-GDP countries. 
The most significant identifier of work disability in all subgroups was Health Assessment Questionnaire (HAQ) functional disability score. Conclusions: Work disability rates remain high among people with RA during this millennium. In low-GDP countries, people remain working with high levels of disability and disease activity. Cultural and economic differences between societies affect work disability as an outcome measure for RA.

Relevance: 20.00%

Abstract:

The purpose of this study was to determine if performing isometric 3-point kneeling exercises on a Swiss ball influenced the isometric force output and EMG activities of the shoulder muscles when compared with performing the same exercises on a stable base of support. Twenty healthy adults performed the isometric 3-point kneeling exercises with the hand placed either on a stable surface or on a Swiss ball. Surface EMG was recorded from the posterior deltoid, pectoralis major, biceps brachii, triceps brachii, upper trapezius, and serratus anterior muscles using surface differential electrodes. All EMG data were reported as percentages of the average root mean square (RMS) values obtained in maximum voluntary contractions for each muscle studied. The highest load value was obtained during exercise on a stable surface. A significant increase was observed in the activation of glenohumeral muscles during exercises on a Swiss ball. However, there were no differences in EMG activities of the scapulothoracic muscles. These results suggest that exercises performed on unstable surfaces may provide muscular activity levels similar to those performed on stable surfaces, without the need to apply greater external loads to the musculoskeletal system. Therefore, exercises on unstable surfaces may be useful during the process of tissue regeneration.

Relevance: 20.00%

Abstract:

Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of this data is critical to understanding the regulatory logic encoded in the genome by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to its coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations yielding insight into genome organization, transcription and its regulation and, ultimately, a better understanding of the mechanisms that enable the cell to dynamically respond to its environment.

Relevance: 20.00%

Abstract:

Melanoma is a highly aggressive and therapy resistant tumor for which the identification of specific markers and therapeutic targets is highly desirable. We describe here the development and use of a bioinformatic pipeline tool, made publicly available under the name of EST2TSE, for the in silico detection of candidate genes with tissue-specific expression. Using this tool we mined the human EST (Expressed Sequence Tag) database for sequences derived exclusively from melanoma. We found 29 UniGene clusters of multiple ESTs with the potential to predict novel genes with melanoma-specific expression. Using a diverse panel of human tissues and cell lines, we validated the expression of a subset of three previously uncharacterized genes (clusters Hs.295012, Hs.518391, and Hs.559350) to be highly restricted to melanoma/melanocytes and named them RMEL1, 2 and 3, respectively. Expression analysis in nevi, primary melanomas, and metastatic melanomas revealed RMEL1 as a novel melanocytic lineage-specific gene up-regulated during melanoma development. RMEL2 expression was restricted to melanoma tissues and glioblastoma. RMEL3 showed strong up-regulation in nevi and was lost in metastatic tumors. Interestingly, we found correlations of RMEL2 and RMEL3 expression with improved patient outcome, suggesting tumor and/or metastasis suppressor functions for these genes. The three genes are composed of multiple exons and map to 2q12.2, 1q25.3, and 5q11.2, respectively. They are well conserved throughout primates, but not other genomes, and were predicted as having no coding potential, although primate-conserved and human-specific short ORFs could be found. Hairpin RNA secondary structures were also predicted. Concluding, this work offers new melanoma-specific genes for future validation as prognostic markers or as targets for the development of therapeutic strategies to treat melanoma.

Relevance: 20.00%

Abstract:

Background: High-throughput molecular approaches for gene expression profiling, such as Serial Analysis of Gene Expression (SAGE), Massively Parallel Signature Sequencing (MPSS), or Sequencing-by-Synthesis (SBS), represent powerful techniques that provide global transcription profiles of different cell types through sequencing of short fragments of transcripts, denominated sequence tags. These techniques have improved our understanding of the relationships between these expression profiles and cellular phenotypes. Despite this, more reliable datasets are still necessary. In this work, we present a web-based tool named S3T: Score System for Sequence Tags, to index sequenced tags according to their reliability. This is done through a series of evaluations based on a defined rule set. S3T allows the identification and selection of tags considered more reliable for further gene expression analysis. Results: This methodology was applied to a public SAGE dataset. In order to compare data before and after filtering, a hierarchical clustering analysis was performed on samples from the same type of tissue, in distinct biological conditions, using these two datasets. Our results provide evidence suggesting that it is possible to find more congruous clusters after using the S3T scoring system. Conclusion: These results substantiate the proposed application to generate more reliable data. This is a significant contribution to the determination of global gene expression profiles. The library analysis with S3T is freely available at http://gdm.fmrp.usp.br/s3t/. S3T source code and datasets can also be downloaded from the aforementioned website.
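A rule-based tag scoring scheme of the kind S3T describes can be sketched as a function that starts from a base score and applies penalties for unreliability signals. The rules, weights, and threshold below are illustrative assumptions, not the actual S3T rule set:

```python
def score_tag(tag: str, count: int) -> float:
    """Score a sequence tag; higher scores indicate higher reliability.
    Rules and penalty weights are hypothetical, for illustration only."""
    score = 1.0
    if count < 2:            # singleton tags are more likely sequencing errors
        score -= 0.5
    if "N" in tag:           # ambiguous base calls reduce confidence
        score -= 0.3
    if len(set(tag)) == 1:   # homopolymer runs suggest an artifact
        score -= 0.4
    return max(score, 0.0)

# Keep only tags above an (assumed) reliability threshold.
tags = {"CATGACGTACG": 5, "CATGNNGTACG": 5, "AAAAAAAAAAA": 7}
reliable = [t for t, c in tags.items() if score_tag(t, c) >= 0.8]
```

Filtering a SAGE library this way before clustering is the "before and after" comparison the Results section refers to.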