875 results for Information quality
Abstract:
Dissertation presented as a partial requirement for the degree of Master in Statistics and Information Management
Abstract:
Research into information quality on the internet, in particular on websites, has become increasingly important in recent years. This paper describes a research project in which a measurement instrument was developed that enables the information quality of websites to be determined and analyzed from the customer perspective. The measurement instrument was developed in several stages on the basis of a methodical-theoretical approach. In the first step, previous research results and measurement instruments were systematically analyzed. In the second step, these results were adjusted and supplemented on the basis of a qualitative study. A quantitative test of the measurement instrument is planned.
Abstract:
Knowledge is central to the modern economy and society. Indeed, the knowledge society has transformed the concept of knowledge and is increasingly aware of the need to overcome a lack of knowledge when it has to make choices or address its problems and dilemmas. One's knowledge is less based on exact facts and more on hypotheses, perceptions or indications. Even when we use new computational artefacts and novel methodologies for problem solving, such as Group Decision Support Systems (GDSSs), the question of incomplete information is in most situations marginalized. On the other hand, common sense tells us that when a decision is made it is impossible to have a perception of all the information involved and the nature of its intrinsic quality. Therefore, something has to be done in terms of the information available and the process of its evaluation. It is under this framework that a Multi-valued Extended Logic Programming language will be used for knowledge representation and reasoning, leading to a model that embodies the Quality-of-Information (QoI) and its quantification along the several stages of the decision-making process. In this way, it is possible to provide a measure of the value of the QoI that supports the decision itself. The model is presented here in the context of a GDSS for VirtualECare, a system aimed at sustaining online healthcare services.
Abstract:
INTRODUCTION: The aim of the study was to assess, in a blind observation, the quality of the clinical records of patients seen in public hospitals in Madrid after a suicide attempt. METHODS: Observational, descriptive cross-sectional study conducted at four general public hospitals in Madrid (Spain). We analysed the presence of seven indicators of information quality (previous psychiatric treatment, recent suicidal ideation, recent suicide planning behaviour, medical lethality of the suicide attempt, previous suicide attempts, attitude towards the attempt, and social or family support) in 993 clinical records of 907 patients (64.5% women), aged 6 to 92 years (mean 37.1±15), admitted to hospital after a suicide attempt or who made an attempt whilst in hospital. RESULTS: Of the patients who attempted suicide, 94.9% received a psychosocial assessment. All seven indicators were documented in 22.5% of the records, whilst 23.6% recorded four or fewer indicators. Previous suicide attempts and the medical lethality of the current attempt were the indicators most often missing from the records. The study found no difference between the records of men and women (z=0.296; p=0.767, two-tailed Mann-Whitney U test), although the clinical records of patients discharged after an emergency unit intervention were more incomplete than those of hospitalised patients (z=2.731; p=0.006), and the clinical records of repeaters were also more incomplete than those of non-repeaters (z=3.511; p<0.001). CONCLUSIONS: Clinical records of patients who have attempted suicide are incomplete. The use of semi-structured screening instruments may improve the evaluation of patients who have self-harmed.
Abstract:
7th Mediterranean Conference on Information Systems, MCIS 2012, Guimaraes, Portugal, September 8-10, 2012, Proceedings Series: Lecture Notes in Business Information Processing, Vol. 129
Abstract:
In distributed video coding, motion estimation is typically performed at the decoder to generate the side information, increasing decoder complexity while providing low-complexity encoding in comparison with predictive video coding. Motion estimation can be performed once to create the side information, or several times to refine the side information quality during the decoding process. In this paper, motion estimation is performed at the decoder to generate multiple side information hypotheses, which are adaptively and dynamically combined whenever additional decoded information becomes available. The proposed iterative side information creation algorithm is inspired by video denoising filters and requires some statistics of the virtual channel between each side information hypothesis and the original data. With the proposed denoising algorithm for side information creation, an RD performance gain of up to 1.2 dB is obtained for the same bitrate.
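The idea of combining several side-information hypotheses, weighting each by the statistics of its virtual channel, can be illustrated with a generic inverse-variance fusion sketch. This is a minimal illustration under assumed per-hypothesis noise variances, not the authors' actual algorithm; the function name and the toy 4-pixel block are hypothetical.

```python
# Illustrative sketch: fuse multiple side-information hypotheses per pixel
# by inverse-variance weighting. The noise variance of each hypothesis's
# "virtual channel" is assumed to be known (in practice it is estimated).

def fuse_hypotheses(hypotheses, noise_variances):
    """Combine pixel lists; lower-variance hypotheses get higher weight."""
    weights = [1.0 / v for v in noise_variances]
    total = sum(weights)
    n_pixels = len(hypotheses[0])
    fused = []
    for i in range(n_pixels):
        # Weighted average across hypotheses for pixel i.
        acc = sum(w * h[i] for w, h in zip(weights, hypotheses))
        fused.append(acc / total)
    return fused

# Two hypotheses for a 4-pixel block; the second is assumed noisier,
# so its contribution is down-weighted by a factor of four.
h1 = [100, 102, 98, 101]
h2 = [110, 90, 100, 99]
fused = fuse_hypotheses([h1, h2], noise_variances=[1.0, 4.0])
```

In a denoising-style refinement loop, the variances would be re-estimated each time additional decoded information becomes available, and the fusion repeated.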
Abstract:
We investigate whether the positive relation between accounting accruals and information asymmetry documented for U.S. stock markets also holds for European markets, considered both as a whole and at the country level. This research is relevant because the relation is likely to be affected by differences in the accounting standards companies use for financial reporting, in the traditional reliance on the banking system or capital markets for firm financing, and in legal systems and cultural environments. We find that in European stock markets discretionary accruals are positively related to the Corwin and Schultz high-low spread estimator, used as a proxy for information asymmetry. Our results suggest that the earnings management component of accruals outweighs the informational component, but the significance of the relation varies across countries. Further, the association tends to be stronger for firms with the highest levels of positive discretionary accruals. Consistent with the evidence provided by those authors, our results also suggest that the high-low spread estimator is more efficient than the closing bid-ask spread when analysing the impact of information quality on information asymmetry.
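The high-low spread proxy mentioned in the abstract can be sketched for a single two-day window as follows. This is a minimal version of the Corwin and Schultz (2012) estimator that omits the paper's overnight-price adjustments and the averaging of window estimates over longer periods; the function name is illustrative.

```python
import math

def corwin_schultz_spread(h1, l1, h2, l2):
    """Single two-day-window high-low spread estimate (Corwin & Schultz, 2012).

    h1, l1: day-1 high and low prices; h2, l2: day-2 high and low prices.
    Returns the estimated proportional bid-ask spread.
    """
    # beta: sum of squared single-day log high-low ranges.
    beta = math.log(h1 / l1) ** 2 + math.log(h2 / l2) ** 2
    # gamma: squared log range over the combined two-day window.
    gamma = math.log(max(h1, h2) / min(l1, l2)) ** 2
    denom = 3.0 - 2.0 * math.sqrt(2.0)
    alpha = (math.sqrt(2.0 * beta) - math.sqrt(beta)) / denom \
            - math.sqrt(gamma / denom)
    spread = 2.0 * (math.exp(alpha) - 1.0) / (1.0 + math.exp(alpha))
    # Negative window estimates are commonly floored at zero.
    return max(spread, 0.0)
```

The intuition is that daily high-low ranges reflect both volatility and the spread, while the two-day range reflects volatility roughly twice but the spread only once, so the two can be separated.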
Abstract:
The air void analyzer (AVA) with its independent isolation base can be used to accurately evaluate the air void system—including volume of entrained air, size of air voids, and distribution of air voids—of fresh portland cement concrete (PCC) on the jobsite. With this information, quality control adjustments in concrete batching can be made in real time to improve the air void system and thus increase freeze-thaw durability. This technology offers many advantages over current practices for evaluating air in concrete.
Abstract:
The objective of this study was to determine how project success can be measured when the output of a project is an intangible information product, what kind of framework can be used to evaluate project success, and how the assessment can be carried out in practice. As a case example, the success of a business blueprint project was assessed from the product point of view. A framework for assessing business blueprint project success was constructed on the basis of a literature review, and separate frameworks for measuring information product quality and project costs were developed. The theory of business blueprinting was found not to be firmly institutionalized, and it is briefly covered in the thesis. The potential net benefits of strategic business process harmonization were noted to be far more significant than the costs of the business blueprint project. From the viewpoint of the created output, the project was judged a sufficient success.
Abstract:
In any enterprise, decisions need to be made during the life cycle of information about its management. This requires information evaluation to take place, a process that is little understood. For evaluation support to be both effective and resource-efficient, some form of automatic or semi-automatic evaluation method would be invaluable. Such a method requires an understanding of the diversity of contexts in which evaluation takes place, so that evaluation support can have the necessary context-sensitivity. This paper identifies the dimensions influencing the information evaluation process and defines the elements that characterise them, thus providing the foundations for a context-sensitive evaluation framework.
Abstract:
The research described here is supported by the award made by the RCUK Digital Economy program to the dot.rural Digital Economy Hub; award reference: EP/G066051/1.
Abstract:
Although managers consider accurate, timely, and relevant information as critical to the quality of their decisions, evidence of large variations in data quality abounds. Over a period of twelve months, the action research project reported herein attempted to investigate and track data quality initiatives undertaken by the participating organisation. The investigation focused on two types of errors: transaction input errors and processing errors. Whenever the action research initiative identified non-trivial errors, the participating organisation introduced actions to correct the errors and prevent similar errors in the future. Data quality metrics were taken quarterly to measure improvements resulting from the activities undertaken during the action research project. The action research project results indicated that for a mission-critical database to ensure and maintain data quality, commitment to continuous data quality improvement is necessary. Also, communication among all stakeholders is required to ensure common understanding of data quality improvement goals. The action research project found that to further substantially improve data quality, structural changes within the organisation and to the information systems are sometimes necessary. The major goal of the action research study is to increase the level of data quality awareness within all organisations and to motivate them to examine the importance of achieving and maintaining high-quality data.
Abstract:
Exporting is one of the main ways in which organizations internationalize. As the export environment becomes more turbulent, heterogeneous, sophisticated and less familiar, the organizational learning ability of the exporting organization may become its only source of sustainable competitive advantage. However, achieving a competitive level of learning is not easy: companies must find ways to improve their learning capability by enhancing the different aspects of the learning process, one of which is export memory. Building on an export information processing framework, this research focuses on the quality of export memory, its determinants, its subsequent use in decision-making, and its ultimate relationship with export performance. Within export memory use, four dimensions were identified: instrumental, conceptual, legitimizing and manipulating. Results of the study, based on data from a mail survey with 354 responses, reveal that the development of export memory quality is positively related to the quality of export information acquisition, the quality of export information interpretation, export coordination, and the integration of the information into the organizational system. Several company and environmental factors were also examined in terms of their relationship with export memory use. The two factors found to be significantly related to the extent of export memory use are the quality of export information acquisition and export memory quality. The results reveal that export memory quality is positively related to the extent of export memory use, which in turn was found to be positively related to export performance. Furthermore, only one aspect of export memory use significantly affects export performance: the extent of export memory use. This finding may mean that no particular type of export memory use is favored, since the choice of the type of use is situation-specific. Additional results reveal that environmental turbulence and export memory overload moderate the relationship between export memory use and export performance.
Abstract:
Understanding how imperfect information affects firms' investment decisions helps answer important questions in economics, such as how we may better measure economic uncertainty; how firms' forecasts affect their decision-making when their beliefs are not backed by economic fundamentals; and how important the business cycle impacts of changes in firms' productivity uncertainty are in an environment of incomplete information. This dissertation provides a synthetic answer to all of these questions, both empirically and theoretically. The first chapter provides empirical evidence that survey-based forecast dispersion identifies a distinctive type of second-moment shock, different from the canonical volatility shocks to productivity, i.e. uncertainty shocks. Such forecast disagreement disturbances can affect the distribution of firm-level beliefs regardless of whether belief changes are backed by changes in economic fundamentals. At the aggregate level, innovations that increase the dispersion of firms' forecasts lead to persistent declines in aggregate investment and output, followed by a slow recovery. By contrast, a larger dispersion of future firm-specific productivity innovations, the standard way to measure economic uncertainty, delivers the "wait and see" effect: aggregate investment experiences a sharp decline, followed by a quick rebound, and then overshoots. At the firm level, the data show that more productive firms increase investment given rises in productivity dispersion for the future, whereas investment drops when firms disagree more about the well-being of their future business conditions. These findings challenge the view that the dispersion of firms' heterogeneous beliefs captures the concept of economic uncertainty as defined by a model of uncertainty shocks.
The second chapter presents a general equilibrium model of heterogeneous firms subject to real productivity uncertainty shocks and informational disagreement shocks. Because firms cannot perfectly disentangle aggregate from idiosyncratic productivity under imperfect information, information quality drives the wedge between the unobserved productivity fundamentals and the firms' beliefs about how productive they are. The distribution of firms' beliefs is no longer perfectly aligned with the distribution of firm-level productivity across firms. This model not only explains why, at the macro and micro level, disagreement shocks differ from uncertainty shocks, as documented in Chapter 1, but also helps reconcile a key challenge faced by the standard framework for studying economic uncertainty: a trade-off between sizable business cycle effects due to changes in uncertainty and the right amount of pro-cyclicality of firm-level investment rate dispersion, as measured by its correlation with output cycles.
Abstract:
OBJECTIVE: To evaluate the quality of the information recorded on fetal death certificates. METHODS: Documentary study of 710 fetal deaths in hospitals in São Paulo, SP, in the first half of 2008, registered in the unified deaths database of the Fundação Sistema Estadual de Análise de Dados and the São Paulo State Health Department (Secretaria de Estado da Saúde de São Paulo). The completeness of the variables on fetal death certificates issued by hospitals and by the Death Verification Service (Serviço de Verificação de Óbitos) was analysed. The death certificate records of a sample of 212 fetal deaths from hospitals of the Unified Health System (Sistema Único de Saúde) were compared with data from medical records and from the Death Verification Service registry. RESULTS: Among the death certificates, 75% were issued by the Death Verification Service, more frequently in hospitals of the Unified Health System (78%). Completeness of the variables was higher on certificates issued by hospitals, and higher still in hospitals outside the Unified Health System. Certificates issued by hospitals showed greater completeness, agreement and sensitivity. Agreement was low and specificity high for variables relating to maternal characteristics. Sex, birth weight and gestational age were more often recorded on certificates issued by the Death Verification Service. Autopsy did not improve the indication of causes of death: unspecified fetal death accounted for 65.7% and intrauterine hypoxia for 24.3%, versus 18.1% and 41.7%, respectively, on certificates issued by hospitals. CONCLUSIONS: The completeness and the indication of causes of death on fetal death certificates need to be improved. The high proportion of autopsies did not improve the quality of the information or the indication of causes of death. The quality of information generated by autopsies depends on access to hospital information.