43 results for information quality
in Aston University Research Archive
Abstract:
INTRODUCTION: The aim of the study was to assess, in a blind observation, the quality of the clinical records of patients seen in public hospitals in Madrid after a suicide attempt. METHODS: Observational, descriptive, cross-sectional study conducted at four general public hospitals in Madrid (Spain). The presence of seven indicators of information quality (previous psychiatric treatment, recent suicidal ideation, recent suicide planning behaviour, medical lethality of the suicide attempt, previous suicide attempts, attitude towards the attempt, and social or family support) was analysed in 993 clinical records of 907 patients (64.5% women), with ages ranging from 6 to 92 years (mean 37.1±15), admitted to hospital after a suicide attempt or who made an attempt whilst in hospital. RESULTS: Of the patients who attempted suicide, 94.9% received a psychosocial assessment. All seven indicators were documented in 22.5% of the records, whilst 23.6% recorded four or fewer indicators. Previous suicide attempts and the medical lethality of the current attempt were the indicators most often missing from the records. The study found no difference between the records of men and women (z=0.296; p=0.767, two-tailed Mann-Whitney U test), although the clinical records of patients discharged after an emergency unit intervention were more incomplete than those of hospitalised patients (z=2.731; p=0.006), and the clinical records of repeaters were more incomplete than those of non-repeaters (z=3.511; p<0.001). CONCLUSIONS: The clinical records of patients who have attempted suicide are not complete. The use of semi-structured screening instruments may improve the evaluation of patients who have self-harmed.
Abstract:
Exporting is one of the main ways in which organizations internationalize. As the export environment becomes more turbulent, heterogeneous, sophisticated and less familiar, the organizational learning ability of the exporting organization may become its only source of sustainable competitive advantage. However, achieving a competitive level of learning is not easy. Companies must be able to find ways to improve their learning capability by enhancing the different aspects of the learning process. One of these is export memory. Building on an export information processing framework, this research focuses on the quality of export memory, its determinants, its subsequent use in decision-making, and its ultimate relationship with export performance. Within export memory use, four dimensions have been identified: instrumental, conceptual, legitimizing and manipulating. Results from the study, based on data from a mail survey with 354 responses, reveal that the development of export memory quality is positively related to the quality of export information acquisition, the quality of export information interpretation, export coordination, and the integration of the information into the organizational system. Several company and environmental factors have also been examined in terms of their relationship with export memory use. The two factors found to be significantly related to the extent of export memory use are the quality of export information acquisition and export memory quality. The results reveal that export memory quality is positively related to the extent of export memory use, which in turn was found to be positively related to export performance. Furthermore, the results show that only one aspect of export memory use significantly affects export performance: the extent of export memory use. This finding could mean that no particular type of export memory use is favoured, since the choice of the type of use is situation specific. Additional results reveal that environmental turbulence and export memory overload have moderating effects on the relationship between export memory use and export performance.
Abstract:
This study explores the institutional logic(s) governing Corporate Internet Reporting (CIR) by Egyptian listed companies. A mixed methods approach was followed. The qualitative part seeks to understand the perceptions, beliefs, values and norms that are commonly shared by Egyptian companies engaged in these practices. Accordingly, seven cases of large listed Egyptian companies operating in different industries were examined, and other stakeholders and stockholders were interviewed in conjunction with these cases. The quantitative part consists of two studies. The first is descriptive, aiming to specify whether the logic(s) induced from the seven cases are commonly embraced by other Egyptian companies. The second is explanatory, aiming to investigate the impact of several institutional and economic factors on the extent of CIR, the types of online information, the quality of the websites and the Internet facilities. Drawing on prior CIR literature, four potential types of logic could be inferred: efficiency-, legitimacy-, technical- and marketing-based logics. In Egypt, legitimacy logic was initially embraced in the early years after the Internet's inception. Later, companies confronted radical challenges in their internal and external environments which impelled them to raise their websites' potentialities to defend their competitive position, either domestically or internationally. Thus, two new logics emphasizing marketing and technical perspectives emerged in response. Strikingly, efficiency-based logic is not the most prevalent logic driving CIR practices in Egypt, unlike in developed countries. The empirical results support this observation and show that almost half of Egyptian listed companies (115 as of December 2010) possessed an active website; half of these (62) disclosed part of their financial and accounting information between December 2010 and February 2011. Less than half of the websites (52) offered the latest annual financial statements. Fewer websites provided shareholder and stock information (33; 29%) or included a separate section for corporate governance (25; 22%), compared with 50 (44%) possessing a section for news or press releases. Additionally, variations in CIR practices, as well as in timeliness and credibility, were evident even at the industry level. After controlling for firm size, profitability, leverage, liquidity, competition and growth, it was found that industrial companies and those facing little competition tend to disclose less. In contrast, management size, foreign investors, foreign listing, dispersion of shareholders and firm size had a significant positive impact, individually or collectively, whereas neither audit firm nor most performance indicators (i.e. profitability, leverage and liquidity) exerted an influence on CIR practices. It is therefore suggested that CIR practices are loosely institutionalised in Egypt, which necessitates issuing several regulative and professional rules to raise the quality attributes of Egyptian websites, especially timeliness and credibility. Besides, this study highlights the potential of assessing the impact of institutional logics on CIR practices and suggests paying equal attention to institutional and economic factors when comparing CIR practices over time or across different institutional environments in the future.
Abstract:
Although the importance of dataset fitness-for-use evaluation and intercomparison is widely recognised within the GIS community, no practical tools have yet been developed to support such interrogation. GeoViQua aims to develop a GEO label which will visually summarise and allow interrogation of key informational aspects of geospatial datasets upon which users rely when selecting datasets for use. The proposed GEO label will be integrated in the Global Earth Observation System of Systems (GEOSS) and will be used as a value and trust indicator for datasets accessible through the GEO Portal. As envisioned, the GEO label will act as a decision support mechanism for dataset selection and thereby hopefully improve user recognition of the quality of datasets. To date we have conducted three user studies to (1) identify the informational aspects of geospatial datasets upon which users rely when assessing dataset quality and trustworthiness, (2) elicit initial user views on a GEO label and its potential role and (3) evaluate prototype label visualisations. Our first study revealed that, when evaluating quality of data, users consider eight facets: dataset producer information; producer comments on dataset quality; dataset compliance with international standards; community advice; dataset ratings; links to dataset citations; expert value judgements; and quantitative quality information. Our second study confirmed the relevance of these facets in terms of the community-perceived function that a GEO label should fulfil: users and producers of geospatial data supported the concept of a GEO label that provides a drill-down interrogation facility covering all eight informational aspects. Consequently, we developed three prototype label visualisations and evaluated their comparative effectiveness and user preference via a third user study to arrive at a final graphical GEO label representation.
When integrated in the GEOSS, an individual GEO label will be provided for each dataset in the GEOSS clearinghouse (or other data portals and clearinghouses) based on its available quality information. Producer and feedback metadata documents are being used to dynamically assess information availability and generate the GEO labels. The producer metadata document can either be a standard ISO compliant metadata record supplied with the dataset, or an extended version of a GeoViQua-derived metadata record, and is used to assess the availability of a producer profile, producer comments, compliance with standards, citations and quantitative quality information. GeoViQua is also currently developing a feedback server to collect and encode (as metadata records) user and producer feedback on datasets; these metadata records will be used to assess the availability of user comments, ratings, expert reviews and user-supplied citations for a dataset. The GEO label will provide drill-down functionality which will allow a user to navigate to a GEO label page offering detailed quality information for its associated dataset. At this stage, we are developing the GEO label service that will be used to provide GEO labels on demand based on supplied metadata records. In this presentation, we will provide a comprehensive overview of the GEO label development process, with specific emphasis on the GEO label implementation and integration into the GEOSS.
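As a rough illustration of the availability assessment described above, the sketch below checks which informational facets are present in combined producer and feedback metadata. The plain-dictionary representation of metadata records and all field names here are invented for illustration; GeoViQua's actual encodings are ISO-style metadata documents, not Python dictionaries.

```python
# The eight informational facets a GEO label summarises (from the studies above).
FACETS = [
    "producer_profile",
    "producer_comments",
    "standards_compliance",
    "citations",
    "quantitative_quality",
    "user_comments",
    "ratings",
    "expert_reviews",
]

def assess_availability(producer_metadata, feedback_metadata):
    """Return, per facet, whether the combined metadata records provide it.

    Missing keys and empty values both count as 'not available', so the
    label only advertises facets a user can actually drill down into.
    """
    combined = {**producer_metadata, **feedback_metadata}
    return {facet: bool(combined.get(facet)) for facet in FACETS}

# Hypothetical records: the producer document contributes a profile and
# standards compliance; the feedback server contributes ratings.
producer = {
    "producer_profile": "Example Mapping Agency",
    "standards_compliance": ["ISO 19115"],
    "citations": [],            # present but empty: not available
}
feedback = {
    "ratings": [4, 5],
    "expert_reviews": [],       # no reviews collected yet
}

label_facets = assess_availability(producer, feedback)
```

A label renderer would then draw each facet as filled or greyed out depending on these booleans, with each filled facet linking to the drill-down page.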
Abstract:
The evaluation of geospatial data quality and trustworthiness presents a major challenge to geospatial data users when making a dataset selection decision. The research presented here therefore focused on defining and developing a GEO label – a decision support mechanism to assist data users in efficient and effective geospatial dataset selection on the basis of quality, trustworthiness and fitness for use. This thesis thus presents six phases of research and development conducted to: (1) identify the informational aspects upon which users rely when assessing geospatial dataset quality and trustworthiness; (2) elicit initial user views on the GEO label role in supporting dataset comparison and selection; (3) evaluate prototype label visualisations; (4) develop a Web service to support GEO label generation; (5) develop a prototype GEO label-based dataset discovery and intercomparison decision support tool; and (6) evaluate the prototype tool in a controlled human-subject study. The results of the studies revealed, and subsequently confirmed, eight geospatial data informational aspects that were considered important by users when evaluating geospatial dataset quality and trustworthiness, namely: producer information, producer comments, lineage information, compliance with standards, quantitative quality information, user feedback, expert reviews, and citations information. Following an iterative user-centred design (UCD) approach, it was established that the GEO label should visually summarise availability and allow interrogation of these key informational aspects. A Web service was developed to support generation of dynamic GEO label representations and integrated into a number of real-world GIS applications. The service was also utilised in the development of the GEO LINC tool – a GEO label-based dataset discovery and intercomparison decision support tool.
The results of the final evaluation study indicated that (a) the GEO label effectively communicates the availability of dataset quality and trustworthiness information and (b) GEO LINC successfully facilitates ‘at a glance’ dataset intercomparison and fitness for purpose-based dataset selection.
Abstract:
A previous review of the quality of formulation information provided for oral medications used in paediatric clinical trials published in 10 highly cited journals between 2002 and 2004 raised concerns. This short report explores whether there has been any subsequent improvement in how the formulations used in trials involving children are reported.
Abstract:
Journal ranking studies have generally adopted citation techniques or academic perceptions as the basis for assessing journal quality. They have traditionally been a source of information about potential research outlets, new journals, and an aid to developing a consensus about the relative merit of publications for promotion decisions. The aim of our research is to address specific shortcomings in the conventional literature and construct an alternative view of how we might more appropriately assess journal ‘quality’. We attempt to engage with the conventional literature by applying an approach that does not privilege either citation techniques or academic perceptions. We have adopted from Zeff (1996) an objective measure of academic journal library holdings, which Zeff describes as a ‘market test’. Our construct provides evidence of an important difference in journal holdings for the Australasian region that could significantly influence further research on journal quality. The method itself is entirely mundane but may be considered to reflect a complex of historic and more contemporary variables which impact on academic and administrative decisions, influencing the makeup of academic library holdings and providing a proxy for journal ‘quality’.
Abstract:
This paper considers the empirical determinants of the quality of information disclosed about directors' share options in a sample of large companies in 1994 and 1995. Policy recommendations, consolidated in the recommendations of the Greenbury report, argue for full and complete disclosure of director option information. In this paper two modest contributions to the UK empirical literature are made. First, the current degree of option information disclosure in the FTSE 350 companies is documented. Second, option information disclosure is modelled as a function of variables that are thought to influence corporate costs of disclosure. The results have implications for corporate governance. Specifically, support is offered for the monitoring function of non-executive directors. In addition, non-disclosure is found to be related to variables which proxy the proprietary costs of revealing information (such as company size).
Abstract:
Aim: To investigate the experiences of people with macular disease within the British healthcare system. Method: The Macular Disease Society Questionnaire, a self-completion questionnaire designed to survey the experiences of people with macular disease, was sent to 2000 randomly selected members of the Macular Disease Society. The questionnaire incorporated items about people's experiences with health professionals and the information and support provided by them at the time of diagnosis and thereafter. Results: Over 50% thought their consultant eye specialist was not interested in them as a person, and 40% were dissatisfied with their diagnostic consultation. 185 people thought their general practitioner (GP) was well informed about macular disease, but twice as many thought their GP was not well informed. Roughly equal numbers thought their GP was supportive as thought their GP was not supportive. A total of 1247 people were told that "nothing can be done to help with your macular disease." Those people experienced a number of negative emotional reactions as a result, with 61% reporting feeling anxious or depressed. Of 282 people experiencing visual hallucinations after diagnosis with macular disease, only 20.9% were offered explanations for them. Conclusions: Many people with macular disease have unsatisfactory experiences of the healthcare system. Many of the reasons for dissatisfaction could be resolved if healthcare professionals were better informed about macular disease and had a better understanding of and empathy with patients' experiences.
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even one that is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
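The abstract does not say which interpolation methods the system uses, but the idea of answering a query at an uninstrumented location can be sketched with inverse distance weighting, one of the simplest spatial interpolators. The station coordinates and temperature readings below are invented for illustration.

```python
import math

def idw_interpolate(stations, query, power=2.0):
    """Estimate a value at `query` (lat, lon) as an inverse-distance-weighted
    average of readings at surrounding stations: nearer stations get more weight."""
    numerator = 0.0
    denominator = 0.0
    for (lat, lon), value in stations:
        d = math.hypot(lat - query[0], lon - query[1])
        if d == 0.0:
            return value  # query coincides with an instrumented station
        weight = 1.0 / d ** power
        numerator += weight * value
        denominator += weight
    return numerator / denominator

# Hypothetical temperature readings (degC) from nearby amateur stations.
stations = [
    ((52.48, -1.89), 14.2),
    ((52.50, -1.85), 13.8),
    ((52.45, -1.92), 14.9),
]

# Query a point with no station of its own.
estimate = idw_interpolate(stations, (52.47, -1.88))
```

Because all weights are positive, the estimate always lies between the minimum and maximum station readings; a fuller treatment (e.g. kriging) would also yield the prediction variance that UncertML is designed to communicate.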
Abstract:
On the basis of a review of the substantive quality and service marketing literature, current knowledge regarding service quality expectations was found to be either absent or deficient. The phenomenon is of increasing importance to both marketing researchers and management and was therefore judged worthy of scholarly consideration. Because the service quality literature was insufficiently rich when embarking on the thesis, three basic research issues were considered, namely the nature, determinants, and dynamics of service quality expectations. These issues were first conceptually and then qualitatively explored. This process generated research hypotheses, mainly relating to a model, which were subsequently tested through a series of empirical investigations using questionnaire data from field studies in a single context. The results were internally consistent and strongly supported the main research hypotheses. It was found that service quality expectations can be meaningfully described in terms of generic/service-specific, intangible/tangible, and process/outcome categories. Service-specific quality expectations were also shown to be determined by generic service quality expectations, demographic variables, personal values, psychological needs, general service sophistication, service-specific sophistication, purchase motives, and service-specific information when treating service class involvement as an exogenous variable. Subjects who had previously not directly experienced a particular service were additionally found to revise their expectations of quality when exposed to the service, with change being driven by a subset of the identified determinants.
Abstract:
This research was conducted at the Space Research and Technology Centre of the European Space Agency at Noordwijk in the Netherlands. ESA is an international organisation that brings together a range of scientists, engineers and managers from 14 European member states. The motivation for the work was to enable decision-makers, in a culturally and technologically diverse organisation, to share information for the purpose of making decisions that are well informed about the risk-related aspects of the situations they seek to address. The research examined the use of decision support system (DSS) technology to facilitate decision-making of this type. This involved identifying the technology available and its application to risk management. Decision-making is a complex activity that does not lend itself to exact measurement or precise understanding at a detailed level. In view of this, a prototype DSS was developed through which to understand the practical issues to be accommodated and to evaluate alternative approaches to supporting decision-making of this type. The problem of measuring the effect upon the quality of decisions has been approached through expert evaluation of the software developed. The practical orientation of this work was informed by a review of the relevant literature in decision-making, risk management, decision support and information technology. Communication and information technology unite the major themes of this work. This allows correlation of the interests of the research with European public policy. The principles of communication were also considered in the topic of information visualisation; this emerging technology exploits flexible modes of human-computer interaction (HCI) to improve the cognition of complex data. Risk management is itself an area characterised by complexity, and risk visualisation is advocated for application in this field of endeavour. The thesis provides recommendations for future work in the fields of decision-making, DSS technology and risk management.
Abstract:
Existing theories of semantic cognition propose models of cognitive processing occurring in a conceptual space, where ‘meaning’ is derived from the spatial relationships between concepts’ mapped locations within the space. Information visualisation is a growing area of research within the field of information retrieval, and methods for presenting database contents visually in the form of spatial data management systems (SDMSs) are being developed. This thesis combined these two areas of research to investigate the benefits associated with employing spatial-semantic mapping (documents represented as objects in two- and three-dimensional virtual environments are proximally mapped depending on the semantic similarity of their content) as a tool for improving retrieval performance and navigational efficiency when browsing for information within such systems. Positive effects associated with the quality of document mapping were observed; improved retrieval performance and browsing behaviour were witnessed when mapping was optimal. It was also shown that using a third dimension for virtual environment (VE) presentation provides sufficient additional information regarding the semantic structure of the environment that performance is increased in comparison to using two dimensions for mapping. A model that describes the relationship between retrieval performance and browsing behaviour was proposed on the basis of these findings. Individual differences were not found to have any observable influence on retrieval performance or browsing behaviour when mapping quality was good. The findings from this work have implications both for cognitive modelling of semantic information and for designing and testing information visualisation systems. These implications are discussed in the conclusions of this work.
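A minimal sketch of spatial-semantic mapping, under stated assumptions: the thesis does not specify its mapping algorithm, so this example uses term-frequency vectors with cosine distance as the semantic measure and classical multidimensional scaling (MDS) as one standard way to turn pairwise semantic distances into low-dimensional positions. The toy document vectors are invented for illustration.

```python
import numpy as np

def cosine_distance_matrix(vectors):
    """Pairwise cosine distances (1 - cosine similarity) between row vectors."""
    norms = np.linalg.norm(vectors, axis=1, keepdims=True)
    similarities = (vectors @ vectors.T) / (norms @ norms.T)
    return 1.0 - similarities

def classical_mds(distances, dims=2):
    """Embed points in `dims` dimensions so that Euclidean distances
    approximate the given distance matrix (classical MDS)."""
    n = distances.shape[0]
    centring = np.eye(n) - np.ones((n, n)) / n
    # Double-centred Gram matrix derived from squared distances.
    gram = -0.5 * centring @ (distances ** 2) @ centring
    eigvals, eigvecs = np.linalg.eigh(gram)
    order = np.argsort(eigvals)[::-1][:dims]           # largest eigenvalues first
    scale = np.sqrt(np.clip(eigvals[order], 0, None))  # drop negative eigenvalues
    return eigvecs[:, order] * scale

# Toy term-frequency vectors: the first two documents share vocabulary,
# the third uses entirely different terms.
docs = np.array([
    [3, 2, 0, 0],
    [2, 3, 1, 0],
    [0, 0, 4, 3],
], dtype=float)

coords = classical_mds(cosine_distance_matrix(docs))
```

In a 2-D VE the rows of `coords` would place the two similar documents near each other and the unrelated one far away; a 3-D mapping is the same computation with `dims=3`.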
Abstract:
A broad-based approach has been used to assess the impact of discharges to rivers from surface water sewers, with the primary objective of determining whether such discharges have a measurable impact on water quality. Three parameters, each reflecting the effects of intermittent pollution, were included in a fieldwork programme of biological and chemical sampling and analysis which covered 47 sewer outfall sites. These parameters were the numbers and types of benthic macroinvertebrates upstream and downstream of the outfalls, the concentrations of metals in sediments, and the concentrations of metals in algae upstream and downstream of the outfalls. Information on the sewered catchments was collected from Local Authorities and by observation at the time of sampling, and includes catchment areas, land uses, evidence of connection to the foul system, and receiving water quality classification. The methods used for site selection, sampling, laboratory analysis and data analysis are fully described, and the survey results presented. Statistical and graphical analysis of the biological data, with the aid of BMWP scores, showed that there was a small but persistent fall in water quality downstream of the studied outfalls. Further analysis including the catchment information indicated that initial water quality, sewered catchment size, receiving stream size, and catchment land use were important factors in determining the impact. Finally, the survey results were used to produce guidelines for the estimation of surface water sewer discharge impacts from knowledge of the catchment characteristics, so that planning authorities can consider water quality when new drainage systems are designed.