43 results for Information resources
Abstract:
The range of consumer health and medicines information sources has diversified along with the increased use of the Internet. This has led to a drive to develop medicines information services and to better incorporate the Internet and e-mail into routine practice in health care and in community pharmacies. To support the development of such services, more information is needed about consumers' use of online information, particularly among those who may be most likely to use and to benefit from the new sources and modes of medicines communication. This study explored the role and utilization of Internet-based medicines information and information services in the context of the wider network of information sources accessible to the public in Finland. The overall aim was to gather information for developing better and more accessible information sources for consumers and services that better meet consumers' needs. Special focus was placed on the needs and information behavior of people with depression who use antidepressant medicines. The study applied both qualitative and quantitative methods. Consumer medicines information needs and sources were identified by analyzing the utilization of the University Pharmacy-operated national drug information call center (Study I) and by surveying Finnish adults' (n=2348) use of different medicines information sources (Study II). The utilization of the Internet as a source of antidepressant information was explored in focus group discussions with people with depression and current or past use of antidepressants (n=29, Studies III & IV). Pharmacies' response to consumer needs in terms of e-mail counseling was assessed by conducting a virtual pseudo-customer study among Finnish community pharmacies (n=161, Study V). Physicians and pharmacists were the primary sources of medicines information. People with mental disorders used telephone- and Internet-based medicines information sources and patient information leaflets more frequently than people without mental disorders. These sources were used to complement rather than replace information provided face-to-face by health professionals. People with depression used the Internet to seek facts about antidepressants, to share experiences with peers, and out of curiosity. They described access to online drug information as empowering, although some reported lacking the skills needed to assess the quality of online information. E-mail medication counseling services provided by community pharmacies were rare and varied in quality. The results suggest that rather than discouraging use of the Internet, health professionals should direct patients to accurate and reliable sources of online medicines information. Health care providers, including community pharmacies, should also seek to develop new ways of communicating information about medicines with consumers. The study found that people with depression who use antidepressants need services enabling interactive communication not only with health care professionals but also with peers. Further research should focus on developing medicines information services that facilitate communication among different patient and consumer groups.
Abstract:
Earlier studies have shown that the speed of information transmission developed radically during the 19th century. The fast development was mainly due to the change from sailing ships and horse-drawn coaches to steamers and railways, as well as the telegraph. The speed of information transmission has normally been measured by calculating the duration between writing and receiving a letter, or between an important event and the time when the news was published elsewhere. As overseas mail was generally carried by ships, the history of communications and maritime history are closely related. This study brings a postal-historical aspect to the academic discussion, and it adds a further new aspect: in business enterprises, information flows generally consisted of multiple transactions. Although fast one-way information was often crucial, e.g. news of a changing market situation, it was at least equally important to be able to react rapidly. To examine the development of business information transmission, the duration of mail transport has been measured by a systematic and commensurable method, using consecutive information circles per year as the principal tool of measurement. The study covers a period of six decades, several of the world's most important trade routes and different mail-carrying systems operated by merchant ships, sailing packets and several nations' steamship services. The main sources have been the sailing data of mail-carrying ships and the correspondence of several merchant houses in England. As the world's main trade routes had their specific historical backgrounds with different businesses, interests and needs, the systems for information transmission did not develop similarly or simultaneously. It was a process lasting several decades, initiated by the idea of organizing sailings in a regular line system. The evolution proceeded generally as follows: originally there was a more or less irregular system, then a regular system and finally a more frequent regular system of mail services. The trend was from sail to steam, but both these means of communication improved following the same scheme. Faster sailings alone did not radically increase the number of consecutive information circles per year if the communication was not frequent enough. Neither did improved frequency advance information circulation if the voyage was very long or if the sailings overlapped instead of complementing each other. The speed of information transmission could be improved by speeding up the voyage itself (technological improvements, minimizing the waiting time at ports of call, etc.), but especially by organizing sailings so that recipients had the possibility to reply to arriving mail without unnecessary delay. It took two to three decades before the mail-carrying shipping companies were able to organize their sailings in an optimal way. Strategic shortcuts over isthmuses (e.g. Panama, Suez), together with cooperation between steamships and railways, enabled the most effective improvements in global communications before the introduction of the telegraph.
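As an illustration of the measurement principle described above, the sketch below (a hypothetical Python calculation, not the study's actual data or method of computation) estimates how many consecutive information circles per year a route supports given one-way passage times and sailing frequency; the formula and all figures are assumptions for illustration only.

```python
# Hypothetical sketch: estimating consecutive "information circles" per year.
# One circle = letter out + recipient's reply back, including the wait for the
# next scheduled departure. All figures below are illustrative assumptions.

def circles_per_year(outbound_days, return_days, departures_per_month):
    """Estimate how many letter-and-reply round trips fit into one year."""
    interval_days = 30.0 / departures_per_month
    avg_wait = interval_days / 2.0           # mean wait for the next departure
    circle_length = outbound_days + avg_wait + return_days
    return 365.0 / circle_length

# A monthly sailing packet with a 35-day passage each way
print(round(circles_per_year(35, 35, departures_per_month=1), 2))  # 4.29
# The same passage with weekly departures: frequency shortens the waiting time
print(round(circles_per_year(35, 35, departures_per_month=4), 2))  # 4.95
# A faster steamer (20 days each way) with weekly departures
print(round(circles_per_year(20, 20, departures_per_month=4), 2))  # 8.34
```

The toy numbers echo the abstract's point that higher frequency helps little while the voyage itself remains long, and that the largest gains come when faster passages and well-phased, frequent sailings are combined.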
Abstract:
Cognitive impairments of attention, memory and executive functions are a fundamental feature of the pathophysiology of schizophrenia. Neurophysiological and neurochemical changes in the auditory cortex have been shown to underlie cognitive impairments in schizophrenia patients. The functional state of the neural substrate of auditory information processing can be objectively and non-invasively probed with auditory event-related potentials (ERPs) and event-related fields (ERFs). In the current work, we explored neurochemical effects on the neural origins of auditory information processing in relation to schizophrenia. By means of ERPs/ERFs we aimed to determine how the neural substrates of auditory information processing are modulated by antipsychotic medication in schizophrenia spectrum patients (Studies I, II) and by neuropharmacological challenges in healthy human subjects (Studies III, IV). First, with auditory ERPs we investigated the effects of olanzapine (Study I) and risperidone (Study II) in a group of patients with schizophrenia spectrum disorders. After 2 and 4 weeks of treatment, olanzapine had no significant effects on mismatch negativity (MMN) and P300, which have been suggested to reflect preattentive and attention-dependent information processing, respectively. After 2 weeks of treatment, risperidone had no significant effect on P300; however, it reduced the P200 amplitude. This latter effect of risperidone on the neural resources responsible for P200 generation could be partly explained through the action of dopamine. Subsequently, we used simultaneous EEG/MEG to investigate the effects of memantine (Study III) and methylphenidate (Study IV) in healthy subjects. We found that memantine modulated the MMN response without changing other ERP components. This could be interpreted as an influence of memantine, acting through NMDA receptors, on the auditory change-detection mechanism, with the processing of auditory stimuli otherwise remaining unchanged. Further, we found that methylphenidate did not modulate the MMN response. This finding could indicate no association between catecholaminergic activity and electrophysiological measures of preattentive auditory discrimination reflected in the MMN. However, methylphenidate decreased the P200 amplitude. This could be interpreted as a modulation of auditory information processing reflected in P200 by the dopaminergic and noradrenergic systems. Taken together, our set of studies indicates a complex pattern of neurochemical influences on the neural substrate of auditory information processing, produced by antipsychotic drugs in patients with schizophrenia spectrum disorders and by pharmacological challenges in healthy subjects, as studied with ERPs and ERFs.
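For context on how measures such as the MMN and P200 are derived from the recordings, the sketch below shows the standard deviant-minus-standard difference-wave computation on fabricated epoched EEG data; the sampling rate, latency window and all values are illustrative assumptions, not the data or pipeline of these studies.

```python
import numpy as np

# Fabricated epoched data: (n_trials, n_samples) at 1000 Hz, epoch 0-400 ms.
# In an oddball paradigm, frequent "standard" tones are interspersed with rare
# "deviant" tones; the MMN is the deviant-minus-standard difference wave.
rng = np.random.default_rng(0)
fs = 1000
t = np.arange(0, 0.4, 1 / fs)
standard_epochs = rng.normal(0.0, 2.0, (300, t.size))
deviant_epochs = rng.normal(0.0, 2.0, (60, t.size))
# Simulate an MMN: a small negativity around 150-250 ms on deviant trials only.
deviant_epochs -= 1.5 * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))

# Average across trials to obtain the ERPs, then subtract.
erp_standard = standard_epochs.mean(axis=0)
erp_deviant = deviant_epochs.mean(axis=0)
difference_wave = erp_deviant - erp_standard

# MMN amplitude is commonly quantified as the mean of the difference wave in a
# latency window around its peak (150-250 ms here, an assumption).
window = (t >= 0.15) & (t <= 0.25)
print(f"MMN amplitude: {difference_wave[window].mean():.2f} (arbitrary units)")
```

A drug effect such as those reported above would then appear as a change in this amplitude between treatment conditions.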
Abstract:
The safety of food has become an increasingly interesting issue to consumers and the media. It has also become a source of concern, as the amount of information on the risks related to food safety continues to expand. Today, risk and safety are permanent elements of the concept of food quality. Safety, in particular, is an attribute that consumers find very difficult to assess. The literature reviewed in this study covers three main themes: traceability; consumer behaviour related to quality and safety issues and the perception of risk; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as of attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kind of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying consumers' willingness to pay makes it possible to determine whether consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. The results showed that those who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results also showed that certain risk factors affect consumer willingness to pay. If the respondents regarded genetic modification of food or foodborne zoonotic diseases as harmful or extremely harmful risk factors, they were more likely to be willing to pay for quality information. The results produced by the models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information. The results also showed that safety-related quality cues are significant to consumers. Above all, consumers would like to receive information on the control of zoonotic diseases that are contagious to humans. Other information related to process control also ranked high among the top responses. Information on any potential genetic modification was also considered important, even though genetic modification was not regarded as a high risk factor.
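To make the modelling step concrete, here is a minimal sketch of a dichotomous-choice analysis on fabricated survey records; the variable names, simulated data and the use of scikit-learn's LogisticRegression are illustrative assumptions, not the study's actual model or dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Fabricated records: perceived harm of GM food and of zoonotic diseases
# (1-5 scale) and whether the respondent is willing to pay for extra
# quality information.
rng = np.random.default_rng(42)
n = 500
gm_risk = rng.integers(1, 6, n)
zoonosis_risk = rng.integers(1, 6, n)
# Simulate the study's premise: higher perceived risk -> more likely to pay.
logit = -2.0 + 0.4 * gm_risk + 0.5 * zoonosis_risk
willing_to_pay = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([gm_risk, zoonosis_risk])
model = LogisticRegression().fit(X, willing_to_pay)
print("coefficients:", model.coef_.round(2))  # positive: risk raises WTP

# Among those willing to pay, the accepted premium can then be averaged,
# analogous to the reported 24% per kilogram (the figures here are simulated).
premium = rng.normal(0.24, 0.10, n).clip(0)
print("mean accepted premium:", round(float(premium[willing_to_pay].mean()), 2))
```

In a real contingent valuation survey the willingness-to-pay question is asked directly; the point of the sketch is only how risk-perception variables enter such a model.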
Abstract:
The purpose of this study is to describe the development of the application of mass spectrometry to the structural analysis of non-coding ribonucleic acids during the past decade. Mass spectrometric methods are compared with traditional gel electrophoretic methods, the performance characteristics of mass spectrometric analyses are studied, and the future trends of mass spectrometry of ribonucleic acids are discussed. Non-coding ribonucleic acids are short polymeric biomolecules which are not translated into proteins, but which may affect gene expression in all organisms. Regulatory ribonucleic acids act through transient interactions with key molecules in signal transduction pathways. These interactions are mediated through specific secondary and tertiary structures. Posttranscriptional modifications in the structures of the molecules may introduce new properties to the organism, such as adaptation to environmental changes or the development of resistance to antibiotics. In the scope of this study, the structural studies include i) determination of the sequence of nucleobases in the polymer chain, ii) characterisation and localisation of posttranscriptional modifications in nucleobases and in the backbone structure, iii) identification of ribonucleic acid-binding molecules and iv) probing of higher-order structures in the ribonucleic acid molecule. Bacteria, archaea, viruses and HeLa cancer cells have been used as target organisms. Synthesised ribonucleic acids consisting of structural regions of interest have been used frequently. Electrospray ionisation (ESI) and matrix-assisted laser desorption ionisation (MALDI) have been used for ionisation of ribonucleic acid analytes. Ammonium acetate and 2-propanol are common solvents for ESI. Trihydroxyacetophenone is the optimal MALDI matrix for ionisation of ribonucleic acids and peptides. Ammonium salts are used as additives in ESI buffers and MALDI matrices to remove cation adducts. Reversed-phase high-performance liquid chromatography has been used for desalting and fractionation of analytes either off-line or on-line, coupled with the ESI source. Triethylamine and triethylammonium bicarbonate are used almost exclusively as ion pair reagents. A Fourier transform ion cyclotron resonance analyser using ESI coupled with liquid chromatography is the platform of choice for all forms of structural analyses. A time-of-flight (TOF) analyser using MALDI may offer a sensitive, easy-to-use and economical solution for simple sequencing of longer oligonucleotides and for analyses of analyte mixtures without prior fractionation. Special analysis software is used for computer-aided interpretation of mass spectra. With mass spectrometry, sequences of 20-30 nucleotides in length may be determined unambiguously. Sequencing may be applied to the quality control of short synthetic oligomers for analytical purposes. Sequencing in conjunction with other structural studies enables accurate localisation and characterisation of posttranscriptional modifications and identification of nucleobases and amino acids at the sites of interaction. High-throughput screening methods for RNA-binding ligands have been developed. Probing of higher-order structures has provided supportive data for computer-generated three-dimensional models of viral pseudoknots. In conclusion, mass spectrometric methods are well suited for structural analyses of small species of ribonucleic acids, such as short non-coding ribonucleic acids in the molecular size region of 20-30 nucleotides. Structural information not attainable with other methods of analysis, such as nuclear magnetic resonance and X-ray crystallography, may be obtained with the use of mass spectrometry. Ligand screening may be used in the search for possible new therapeutic agents. Demanding assay design and challenging interpretation of data require multidisciplinary knowledge. The implementation of mass spectrometry in structural studies of ribonucleic acids is probably most efficiently conducted in specialist groups consisting of researchers from various fields of science.
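As a concrete illustration of the kind of calculation that underlies MS-based sequencing, the sketch below computes the approximate neutral monoisotopic mass of a short RNA oligonucleotide from its sequence; the terminal-group convention (5'-phosphate, 3'-OH) and the helper function are illustrative assumptions rather than part of the reviewed tools.

```python
# Illustrative sketch: approximate monoisotopic mass of a short RNA oligomer,
# assuming a linear chain with a 5'-phosphate and a 3'-OH (masses in daltons).
# Residue masses are the nucleoside 5'-monophosphates minus one water.
RESIDUE_MASS = {
    "A": 329.05252,
    "C": 305.04129,
    "G": 345.04744,
    "U": 306.02530,
}
WATER = 18.01056  # added back once for the chain's terminal groups


def rna_monoisotopic_mass(sequence: str) -> float:
    """Neutral monoisotopic mass of an RNA sequence such as 'GCAU'."""
    return sum(RESIDUE_MASS[base] for base in sequence.upper()) + WATER


print(round(rna_monoisotopic_mass("GCAUCGAUGC"), 2))  # ~3238.43
```

The same residue table is what makes mass-ladder sequencing possible: consecutive fragment masses in a ladder differ by exactly one residue mass, which identifies the base at each position.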
Abstract:
In this Thesis, we develop theory and methods for computational data analysis. The problems in data analysis are approached from three perspectives: statistical learning theory, the Bayesian framework, and the information-theoretic minimum description length (MDL) principle. Contributions in statistical learning theory address the possibility of generalization to unseen cases, and regression analysis with partially observed data, with an application to mobile device positioning. In the second part of the Thesis, we discuss so-called Bayesian network classifiers and show that they are closely related to logistic regression models. In the final part, we apply the MDL principle to tracing the history of old manuscripts and to noise reduction in digital signals.
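To illustrate the connection between Bayesian network classifiers and logistic regression mentioned above (an independent numerical illustration, not the Thesis's own derivation), the sketch below fits a Gaussian naive Bayes classifier and a logistic regression model to the same fabricated two-class data; when the class-conditional variances are shared, both models realise the same logistic-in-a-linear-function posterior.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

# Fabricated two-class data with a shared within-class variance, the setting
# in which the naive Bayes posterior is exactly logistic in a linear function
# of the features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (500, 2)), rng.normal(1.5, 1.0, (500, 2))])
y = np.repeat([0, 1], 500)

nb = GaussianNB().fit(X, y)          # generative fit of the same functional form
lr = LogisticRegression().fit(X, y)  # discriminative fit

# Class probabilities on new points closely match.
X_new = rng.normal(0.75, 1.0, (5, 2))
print(np.round(nb.predict_proba(X_new)[:, 1], 3))
print(np.round(lr.predict_proba(X_new)[:, 1], 3))
```

The difference between the two lies in how the shared parameters are estimated: generatively from class-conditional densities versus discriminatively from the conditional likelihood.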
Abstract:
Information visualization is the process of constructing a visual presentation of abstract quantitative data. The characteristics of visual perception enable humans to recognize, with little effort, patterns, trends and anomalies inherent in data shown in a visual display; such properties of the data are likely to be missed in a purely text-based presentation. Visualizations are therefore widely used in contemporary business decision support systems. Visual user interfaces called dashboards are tools for reporting the status of a company and its business environment to facilitate business intelligence (BI) and performance management activities. In this study, we examine the research on the principles of human visual perception and information visualization, as well as the application of visualization in a business decision support system. A review of current BI software products reveals that the visualizations included in them are often quite ineffective in communicating important information. Based on the principles of visual perception and information visualization, we summarize a set of design guidelines for creating effective visual reporting interfaces.
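As an example of the kind of guideline such a review typically produces (the specific code and chart choice here are illustrative assumptions, not material from the study), the matplotlib sketch below renders a sorted horizontal bar chart with minimal non-data ink, a form the perception literature generally favours over pie charts for comparing magnitudes.

```python
import matplotlib.pyplot as plt

# Illustrative dashboard guideline: encode quantities as aligned bar lengths
# (judged accurately by the eye) rather than pie-slice angles, sort by value
# so the ranking is visible at a glance, and remove decoration that carries
# no data. The sales figures below are fabricated.
regions = ["North", "South", "East", "West", "Online"]
sales = [420, 380, 510, 290, 610]

values, labels = zip(*sorted(zip(sales, regions)))

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(labels, values, color="#4C72B0")
ax.set_xlabel("Sales (k€)")
ax.set_title("Sales by channel")
for side in ("top", "right"):
    ax.spines[side].set_visible(False)
fig.tight_layout()
plt.show()
```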
Abstract:
The study explores new ideational changes in the information strategy of the Finnish state between 1998 and 2007, after a juncture in Finnish governing in the early 1990s. The study scrutinizes the economic reframing of institutional openness in Finland, which comes with significant and often unintended institutional consequences of transparency. Most notably, the constitutional principle of publicity (julkisuusperiaate), a Nordic institutional peculiarity allowing public access to state information, is now becoming an instrument of economic performance and accountability through results. Finland has a long institutional history in the publicity of government information, acknowledged by law since 1951. Nevertheless, access to government information became a policy concern in the mid-1990s, involving a historical narrative of openness as a Nordic tradition of Finnish governing: Nordic openness (pohjoismainen avoimuus). International interest in the transparency of governance has also marked an opening for institutional re-descriptions in the Nordic context. The essential added value, or contradictory term, that transparency brings to the Finnish conceptualisation of governing is the innovation that public acts of governing can be economically efficient. This is most apparent in the new attempts at providing standardised information on government and expressing it in numbers. In Finland, the publicity of government information has been a concept with democratic connotations, but new internationally diffusing ideas of performance and national economic competitiveness are discussed under the notion of transparency and its peer concepts, openness and public (sector) information, which are also newcomers to the Finnish vocabulary of governing. The above concepts often conflict with one another, paving the way for unintended consequences of the reforms conducted in their name. Moreover, the study argues that the policy concerns over openness and public sector information are linked to the new drive for transparency. Drawing on theories of new institutionalism, political economy, and conceptual history, the study argues for a reinvention of Nordic openness in two senses. First, in referring to institutional history, the policy discourse of Nordic openness discovers an administrative tradition in response to new dilemmas of public governance. Moreover, this normatively appealing discourse also legitimizes the new ideational changes. Second, a former mechanism of democratic accountability is being reframed with market and performance ideas, mostly originating from the sphere of transnational governance and governance indices. Mobilizing different research techniques and data (public documents of the Finnish government and international organizations, some 30 interviews of Finnish civil servants, and statistical time series), the study asks how the above ideational changes have been possible, pointing to the importance of nationalistically appealing historical narratives and normative concepts of governing. Concerning institutional developments, the study analyses the ideational changes in central steering mechanisms (political, normative and financial steering) and the introduction of budget transparency and performance management in two cases: census data (Population Register Centre) and foreign political information (Ministry for Foreign Affairs). The new policy domain of governance indices is also explored as a type of transparency.
The study further asks what institutional transformations are to be observed in the above cases and in the accountability system. The study concludes that while the information rights of citizens have been reinforced and recalibrated during the period under scrutiny, there has also been a conversion of institutional practices towards economic performance. As the discourse of Nordic openness has remained rather unquestioned, the new internationally circulating ideas of transparency and the knowledge economy have entered this discourse without public notice. Since the mid-1990s, state registry data has been perceived as an exploitable economic resource, in Finland and in the EU under the notion of public sector information. This is a development parallel to the new drive for budget transparency in organisations as vital to the state as the Population Register Centre, which has led to the marketization of census data in Finland, an international exception. In the Finnish Ministry for Foreign Affairs, the post-Cold War rhetorical shift from secrecy to performance-driven openness marked a conversion in institutional practices that now hold information services in high regard. But this has not necessarily led to increased publicity of foreign political information. In this context, openness is also defined as sharing information with select actors, as a trust-based non-public activity, deemed necessary amid global economic competition. Regarding the accountability system, deliberation and performance now overlap, making it increasingly difficult to identify to whom and for what the public administration is accountable. These evolving institutional practices are characterised by unintended consequences and paradoxes. History is a paradoxical component in the above institutional change, as long-term institutional developments now justify short-term reforms.
Abstract:
In economics, the information that economic agents have and regard as relevant to their decision making is often assumed to be exogenous. It is assumed that the agents either possess or can observe the payoff-relevant information without having to exert any effort to acquire it. In this thesis we relax the assumption of an ex-ante fixed information structure and study what happens to equilibrium behavior when the agents must also decide what information to acquire and when to acquire it. The thesis addresses this question in two essays on herding and two essays on auction theory. In the first two essays, which are joint work with Klaus Kultti, we study herding models where it is costly to acquire information on the actions that the preceding agents have taken. In our model the agents have to decide both the action that they take and, additionally, the information that they want to acquire by observing their predecessors. We characterize the equilibrium behavior when the decision to observe preceding agents' actions is endogenous and show how the equilibrium outcome may differ from that of the standard model, where all preceding agents' actions are assumed to be observable. In the latter part of the thesis we study two dynamic auctions: the English and the Dutch auction. We consider a situation where bidders are uninformed about their valuations for the object that is put up for sale and may acquire this information at a small cost at any point during the auction. We study the case of independent private valuations. In the third essay of the thesis we characterize the equilibrium behavior in an English auction when there are informed and uninformed bidders. We show that the informed bidder may jump bid and signal to the uninformed that he has a high valuation, thus deterring the uninformed from acquiring information and staying in the auction. The uninformed bidder optimally acquires information once the price has passed a particular threshold and the informed bidder has not signalled that his valuation is high. In addition, we provide an example of an information structure in which the informed bidder initially waits and then makes multiple jump bids. In the fourth essay of the thesis we study the Dutch auction. We consider two cases in which all bidders are initially uninformed. In the first case the information acquisition cost is the same across all bidders; in the second, the cost of information acquisition is also independently distributed and private information to the bidders. We characterize a mixed-strategy equilibrium in the first case and a pure-strategy equilibrium in the second. In addition, we provide a conjecture of an equilibrium in an asymmetric situation where there is one informed and one uninformed bidder. We compare the revenues that the first-price sealed-bid auction and the Dutch auction generate and find that under some circumstances the Dutch auction outperforms the first-price sealed-bid auction. The usual first-price sealed-bid auction and the Dutch auction are strategically equivalent; however, this equivalence breaks down when information is acquired during the auction.
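For background on the strategic equivalence mentioned at the end (a standard textbook benchmark with exogenous information, not the thesis's contribution), the sketch below computes the symmetric equilibrium bid in a first-price sealed-bid auction with valuations drawn independently and uniformly on [0, 1]; in this benchmark a Dutch-auction bidder plans to stop the clock at the same price, and it is this equivalence that costly, mid-auction information acquisition breaks.

```python
# Textbook benchmark: first-price sealed-bid auction, n bidders, valuations
# i.i.d. uniform on [0, 1]. The symmetric equilibrium bid shades the valuation
# by the factor (n - 1) / n. With exogenous information the Dutch auction is
# strategically equivalent, so the same stopping price is optimal there.

def equilibrium_bid(valuation: float, n_bidders: int) -> float:
    """Symmetric equilibrium bid b(v) = (n - 1) / n * v."""
    return (n_bidders - 1) / n_bidders * valuation

for n in (2, 3, 10):
    print(n, round(equilibrium_bid(0.8, n), 3))
# 2 bidders -> 0.4, 3 -> 0.533, 10 -> 0.72: more competition, less shading.
```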