891 results for information source


Relevance: 30.00%

Publisher:

Abstract:

The range of consumer health and medicines information sources has diversified along with the increased use of the Internet. This has led to a drive to develop medicines information services and to better incorporate the Internet and e-mail into routine practice in health care and in community pharmacies. To support the development of such services, more information is needed about consumers' use of online information, particularly among those who may be the most likely to use and benefit from the new sources and modes of medicines communication. This study explored the role and utilization of Internet-based medicines information and information services in the context of the wider network of information sources accessible to the public in Finland. The overall aim was to gather information to develop better and more accessible sources of information for consumers, and services that better meet consumers' needs. Special focus was on the needs and information behavior of people with depression who use antidepressant medicines. This study applied both qualitative and quantitative methods. Consumer medicines information needs and sources were identified by analyzing the utilization of the national drug information call center operated by the University Pharmacy (Study I) and by surveying Finnish adults' (n=2348) use of different medicines information sources (Study II). The utilization of the Internet as a source of antidepressant information was explored through focus group discussions among people with depression and current or past use of antidepressants (n=29, Studies III & IV). Pharmacy response to consumer needs in terms of providing e-mail counseling was assessed by conducting a virtual pseudo-customer study among Finnish community pharmacies (n=161, Study V). Physicians and pharmacists were the primary sources of medicines information.
People with mental disorders were more frequent users of telephone- and Internet-based medicines information sources and patient information leaflets than people without mental disorders. These sources were used to complement rather than replace information provided face-to-face by health professionals. People with depression used the Internet to seek facts about antidepressants, to share experiences with peers, and out of curiosity. They described access to online drug information as empowering, although some reported lacking the skills necessary to assess the quality of online information. E-mail medication counseling services provided by community pharmacies were rare and varied in quality. The results suggest that rather than discouraging the use of the Internet, health professionals should direct patients to accurate and reliable sources of online medicines information. Health care providers, including community pharmacies, should also seek to develop new ways of communicating information about medicines with consumers. This study determined that people with depression who use antidepressants need services enabling interactive communication not only with health care professionals but also with peers. Further research should focus on developing medicines information services that facilitate communication among different patient and consumer groups.

Relevance: 30.00%

Publisher:

Abstract:

The development of innovative methods of stock assessment is a priority for State and Commonwealth fisheries agencies. It is driven by the need to facilitate sustainable exploitation of naturally occurring fisheries resources for the current and future economic, social and environmental well-being of Australia. This project was initiated in this context and took advantage of considerable recent achievements in genomics that are shaping our comprehension of the DNA of humans and animals. The basic idea behind this project was that genetic estimates of effective population size, which can be made from empirical measurements of genetic drift, are equivalent to estimates of the number of successful spawners, an important parameter in the fisheries stock assessment process. The broad objectives of this study were to:
1. critically evaluate a variety of mathematical methods of calculating effective spawner numbers (Ne) by
a. conducting comprehensive computer simulations, and
b. analysing empirical data collected from the Moreton Bay population of tiger prawns (P. esculentus);
2. lay the groundwork for the application of the technology in the northern prawn fishery (NPF); and
3. produce software for the calculation of Ne and make it widely available.
The project pulled together a range of mathematical models for estimating current effective population size from diverse sources. Some of them had recently been implemented with the latest statistical methods (e.g. the Bayesian framework of Berthier, Beaumont et al. 2002), while others had lower profiles (e.g. Pudovkin, Zaykin et al. 1996; Rousset and Raymond 1995). Computer code, and later software with a user-friendly interface (NeEstimator), was produced to implement the methods. This was used as a basis for simulation experiments evaluating the performance of the methods with an individual-based model of a prawn population.
Following the guidelines suggested by the computer simulations, the tiger prawn population in Moreton Bay (south-east Queensland) was sampled for genetic analysis with eight microsatellite loci in three successive spring spawning seasons in 2001, 2002 and 2003. As predicted by the simulations, the estimates had non-infinite upper confidence limits, which is a major achievement for the application of the method to a naturally occurring, short-generation, highly fecund invertebrate species. The genetic estimate of the number of successful spawners was around 1000 individuals in two consecutive years. This contrasts with the roughly 500,000 prawns participating in spawning. As it is not possible to distinguish successful from non-successful spawners, we suggest a high level of protection for the entire spawning population. We interpret the difference between the numbers of successful and non-successful spawners as reflecting a large variation in the number of surviving offspring per family: many families have no surviving offspring, while a few have a large number. We explored various ways in which Ne can be useful in fisheries management. It can be a surrogate for spawning population size, assuming the ratio between Ne and spawning population size has previously been calculated for that species. Alternatively, it can be a surrogate for recruitment, again assuming that the ratio between Ne and recruitment has previously been determined. The number of species that can be analysed in this way, however, is likely to be small because of species-specific life history requirements that need to be satisfied for accuracy. The most universal approach would be to integrate Ne with spawning stock-recruitment models, so that these models are more accurate when applied to fisheries populations. A pathway to achieve this was established in this project, which we predict will significantly improve fisheries sustainability in the future.
Regardless of the success of integrating Ne into spawning stock-recruitment models, Ne could be used as a fisheries monitoring tool. Declines in spawning stock size, or increases in natural or harvest mortality, would be reflected by a decline in Ne. This would be valuable for data-poor fisheries and would provide fishery-independent information; however, we suggest a species-by-species approach, as some species may be too numerous, or experience too much migration, for the method to work. During the project, two important theoretical studies of the simultaneous estimation of effective population size and migration were published (Vitalis and Couvet 2001b; Wang and Whitlock 2003). These methods, combined with the collection of preliminary genetic data from the tiger prawn population in the southern Gulf of Carpentaria and a computer simulation study that evaluated the effect of differing reproductive strategies on genetic estimates, suggest that this technology could make an important contribution to the stock assessment process in the northern prawn fishery (NPF). Advances in the genomics world are rapid, and a cheaper, more reliable substitute for microsatellite loci in this technology is already available. Digital data from single nucleotide polymorphisms (SNPs) are likely to supersede ‘analogue’ microsatellite data, making it cheaper and easier to apply the method to species with large population sizes.
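The temporal principle behind these genetic estimates can be sketched in a few lines. The snippet below illustrates a standard temporal estimator (Nei and Tajima's Fc statistic with a sampling correction); it is a simplified sketch of the general idea, not the code shipped in NeEstimator, and the allele frequencies, sample sizes and generation count are invented:

```python
# Temporal method sketch: genetic drift between two samples taken t
# generations apart implies an effective population size Ne.

def fc_statistic(p0, pt):
    """Standardised variance in allele frequency between two temporal samples."""
    terms = [(x - y) ** 2 / ((x + y) / 2 - x * y) for x, y in zip(p0, pt)]
    return sum(terms) / len(terms)

def ne_temporal(p0, pt, generations, s0, st):
    """Point estimate of Ne from temporal drift (with sampling correction).

    p0, pt : allele frequencies at the same loci, `generations` apart
    s0, st : numbers of individuals sampled at the two time points
    """
    fc = fc_statistic(p0, pt)
    denom = 2 * (fc - 1 / (2 * s0) - 1 / (2 * st))
    # If observed drift is no larger than sampling noise, Ne is unbounded.
    return float('inf') if denom <= 0 else generations / denom

# Invented allele frequencies at four loci, one generation apart:
p0 = [0.42, 0.18, 0.65, 0.30]
pt = [0.50, 0.10, 0.57, 0.38]
print(round(ne_temporal(p0, pt, generations=1, s0=100, st=100)))  # → 21
```

The infinite branch mirrors the issue mentioned in the abstract: when sampling noise swamps drift, the upper confidence limit on Ne is infinite, which is why obtaining non-infinite limits for a highly fecund species was notable.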

Relevance: 30.00%

Publisher:

Abstract:

Background: With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process, leading to increased throughput. To efficiently handle the large amounts of genotypic data generated, and to help with quality control, there is a strong need for a software system that can track samples and capture and manage data at the different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols and by recording and annotating data from every step of the workflow.

Results: A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high-throughput molecular genotyping facility. The application is modular in design and is simple to learn and use. It leads the user through each step of the process, from starting an experiment to storing the output data from the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and that all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through forms for file uploads. The LIMS provides functions to trace any genotypic data back to the electrophoresis gel files or sample source, and to repeat experiments. The LIMS is presently being used to capture high-throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics.

Conclusions: A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high-throughput genotyping laboratory.
The application with source code is freely available for academic users and can be downloaded from http://www.icrisat.org/bt-software-d-lims.htm
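The trace-back function described above can be illustrated with a minimal sketch: every genotype record keeps links to the gel file and the source DNA sample, so any data point can be walked back through the workflow. The schema and records below are invented for illustration and are unrelated to the actual ICRISAT application:

```python
# Toy LIMS traceability model: genotype call -> gel file -> sample source.
samples = {"S1": {"source": "chickpea leaf, plot 12"}}
gels = {"G7": {"file": "gel_G7.tif", "samples": ["S1"]}}
genotypes = [{"sample": "S1", "gel": "G7", "marker": "SSR-23", "allele": 187}]

def trace_back(genotype):
    """Return the gel file and sample source behind one genotype call."""
    gel = gels[genotype["gel"]]
    src = samples[genotype["sample"]]["source"]
    return {"gel_file": gel["file"], "sample_source": src}

print(trace_back(genotypes[0]))
# → {'gel_file': 'gel_G7.tif', 'sample_source': 'chickpea leaf, plot 12'}
```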

Relevance: 30.00%

Publisher:

Abstract:

The safety of food has become an issue of increasing interest to consumers and the media. It has also become a source of concern, as the amount of information on the risks related to food safety continues to expand. Today, risk and safety are permanent elements of the concept of food quality. Safety, in particular, is an attribute that consumers find very difficult to assess. The literature review in this study consists of three main themes: traceability; consumer behaviour related to quality and safety issues and the perception of risk; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as of attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kinds of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying consumers' willingness to pay makes it possible to determine whether consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. The results showed that those who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results also showed that certain risk factors affect consumer willingness to pay. Respondents who considered genetic modification of food or foodborne zoonotic diseases to be harmful or extremely harmful risk factors were more likely to be willing to pay for quality information. The results produced by the models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information.
The results also showed that safety-related quality cues are significant to consumers. First and foremost, consumers would like to receive information on the control of zoonotic diseases that are contagious to humans. Other process-control-related information also ranked high among the top responses. Information on any potential genetic modification was considered important as well, even though genetic modification was not regarded as a high risk factor.

Relevance: 30.00%

Publisher:

Abstract:

It has been known for decades that particles deposited within the respiratory system can cause adverse health effects. Atmospheric aerosol particles influence climate by scattering solar radiation, but they also act as the nuclei around which cloud droplets form. The principal objectives of this thesis were to investigate the chemical composition and the sources of fine particles in different environments (traffic, urban background, remote) as well as during some specific air pollution situations. Quantifying the climate and health effects of atmospheric aerosols is not possible without detailed information on the aerosol chemical composition. Aerosol measurements were carried out at nine sites in six countries (Finland, Germany, the Czech Republic, the Netherlands, Greece and Italy). Several different instruments were used to measure both the particulate matter (PM) mass and its chemical composition. In the off-line measurements the samples were first collected on a substrate or filter, and the gravimetric and chemical analyses were conducted in the laboratory. In the on-line measurements the sampling and analysis were either a combined procedure or performed successively within the same instrument. Results from the impactor samples were analyzed by statistical methods. This thesis also comprises work in which a method was developed for determining the size distribution of carbonaceous matter using a multistage impactor. It was found that the chemistry of PM usually has strong spatial, temporal and size-dependent variability. At the Finnish sites most of the fine PM consisted of organic matter, whereas in Greece sulfate dominated the fine PM and in Italy nitrate made the largest contribution. Regarding the size-dependent chemical composition, organic components were likely to be enriched in smaller particles than inorganic ions. Data analysis showed that organic carbon (OC) had four major sources in Helsinki.
Secondary production was the major source of OC in Helsinki during spring, summer and fall, whereas in winter biomass combustion dominated. A significant impact of biomass combustion on OC concentrations was also observed in the measurements performed in Central Europe. In this thesis, aerosol samples were collected mainly by conventional filter and impactor methods, which suffer from long integration times. However, chemical mass closure was achieved accurately with the filter and impactor measurements, and simple filter sampling was found useful for explaining the sources of PM on a seasonal basis. The on-line instruments gave additional information on the temporal variations of the sources and on atmospheric mixing conditions.
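The mass closure mentioned above amounts to a simple budget: the chemically analysed components are summed and compared against the gravimetric PM mass. The sketch below illustrates the arithmetic with invented concentrations; the OM = 1.6 × OC conversion is a common literature assumption, not a value taken from this work:

```python
OM_PER_OC = 1.6   # organic matter estimated from measured organic carbon

def mass_closure(components_ugm3, gravimetric_ugm3):
    """Return (identified mass, closure as a fraction of gravimetric mass)."""
    identified = sum(components_ugm3.values())
    return identified, identified / gravimetric_ugm3

oc = 4.1  # measured organic carbon, ug/m3 (invented)
sample = {
    "organic matter":   OM_PER_OC * oc,
    "elemental carbon": 0.8,
    "sulfate":          2.3,
    "nitrate":          1.1,
    "ammonium":         1.0,
    "sea salt":         0.4,
}
identified, closure = mass_closure(sample, gravimetric_ugm3=12.5)
print(f"{identified:.1f} ug/m3 identified, closure {closure:.0%}")
# → 12.2 ug/m3 identified, closure 97%
```

Closure well below or above 100% flags unanalysed components (e.g. particle-bound water) or inconsistent conversion factors.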

Relevance: 30.00%

Publisher:

Abstract:

Recent epidemiological studies have shown a consistent association of the mass concentration of urban air thoracic (PM10) and fine (PM2.5) particles with mortality and morbidity among cardiorespiratory patients. However, the chemical characteristics of the different particulate size ranges and the biological mechanisms responsible for these adverse health effects are not well known. The principal aims of this thesis were to validate a high volume cascade impactor (HVCI) for the collection of particulate matter for physicochemical and toxicological studies, and to make an in-depth chemical and source characterisation of samples collected during different pollution situations. The particulate samples were collected with the HVCI, virtual impactors and a Berner low-pressure impactor in six European cities: Helsinki, Duisburg, Prague, Amsterdam, Barcelona and Athens. The samples were analysed for particle mass, common ions, total and water-soluble elements, as well as elemental and organic carbon. Laboratory calibration and field comparisons indicated that the HVCI can provide unique large-capacity, high-efficiency sampling of size-segregated aerosol particles. The cutoff sizes of the recommended HVCI configuration were 2.4, 0.9 and 0.2 μm. The HVCI mass concentrations were in good agreement with the reference methods, but the chemical composition of especially the fine particulate samples showed some differences. This implies that the chemical characterisation of the exposure variable in toxicological studies needs to be done from the same HVCI samples as are used in the cell and animal studies. The data from parallel low-volume reference samplers provide valuable additional information for chemical mass closure and source assessment.
The major components of PM2.5 in the virtual impactor samples were carbonaceous compounds, secondary inorganic ions and sea salt, whereas those of coarse particles (PM2.5-10) were soil-derived compounds, carbonaceous compounds, sea salt and nitrate. The major and minor components together accounted for 77-106% and 77-96% of the gravimetrically measured masses of fine and coarse particles, respectively. Relatively large differences between sampling campaigns were observed in the organic carbon content of the PM2.5 samples as well as in the mineral composition of the PM2.5-10 samples. A source assessment based on chemical tracers suggested clear differences in the dominant sources (e.g. traffic, residential heating with solid fuels, metal industry plants, regional or long-range transport) between the sampling campaigns. In summary, the field campaigns exhibited different profiles with regard to particulate sources, size distribution and chemical composition, thus providing a highly useful setup for toxicological studies on the size-segregated HVCI samples.

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this study is to describe how the application of mass spectrometry to the structural analysis of non-coding ribonucleic acids has developed over the past decade. Mass spectrometric methods are compared with traditional gel electrophoretic methods, the performance characteristics of mass spectrometric analyses are examined, and future trends in the mass spectrometry of ribonucleic acids are discussed. Non-coding ribonucleic acids are short polymeric biomolecules which are not translated to proteins but which may affect gene expression in all organisms. Regulatory ribonucleic acids act through transient interactions with key molecules in signal transduction pathways. These interactions are mediated by specific secondary and tertiary structures. Posttranscriptional modifications in the structures of the molecules may introduce new properties to the organism, such as adaptation to environmental changes or the development of resistance to antibiotics. In the scope of this study, the structural studies include i) determination of the sequence of nucleobases in the polymer chain, ii) characterisation and localisation of posttranscriptional modifications in nucleobases and in the backbone structure, iii) identification of ribonucleic acid-binding molecules and iv) probing of higher-order structures in the ribonucleic acid molecule. Bacteria, archaea, viruses and HeLa cancer cells have been used as target organisms. Synthesised ribonucleic acids consisting of structural regions of interest have been used frequently. Electrospray ionisation (ESI) and matrix-assisted laser desorption ionisation (MALDI) have been used for the ionisation of ribonucleic acid analytes. Ammonium acetate and 2-propanol are common solvents for ESI. Trihydroxyacetophenone is the optimal MALDI matrix for the ionisation of ribonucleic acids and peptides. Ammonium salts are used as additives in ESI buffers and MALDI matrices to remove cation adducts.
Reverse-phase high-performance liquid chromatography has been used for desalting and fractionation of analytes, either off-line or on-line coupled with the ESI source. Triethylamine and triethylammonium bicarbonate are used almost exclusively as ion-pair reagents. A Fourier transform ion cyclotron resonance analyser using ESI coupled with liquid chromatography is the platform of choice for all forms of structural analyses. A time-of-flight (TOF) analyser using MALDI may offer a sensitive, easy-to-use and economical solution for simple sequencing of longer oligonucleotides and for the analysis of analyte mixtures without prior fractionation. Special analysis software is used for computer-aided interpretation of the mass spectra. With mass spectrometry, sequences of 20-30 nucleotides in length may be determined unambiguously. Sequencing may be applied to the quality control of short synthetic oligomers for analytical purposes. Sequencing in conjunction with other structural studies enables accurate localisation and characterisation of posttranscriptional modifications and the identification of nucleobases and amino acids at the sites of interaction. High-throughput screening methods for RNA-binding ligands have been developed, and probing of higher-order structures has provided supportive data for computer-generated three-dimensional models of viral pseudoknots. In conclusion, mass spectrometric methods are well suited for the structural analysis of small species of ribonucleic acids, such as short non-coding ribonucleic acids in the molecular size region of 20-30 nucleotides. Structural information not attainable with other methods of analysis, such as nuclear magnetic resonance and X-ray crystallography, may be obtained with mass spectrometry. Ligand screening may be used in the search for possible new therapeutic agents.
Demanding assay design and the challenging interpretation of data require multidisciplinary knowledge. The application of mass spectrometry to structural studies of ribonucleic acids is therefore probably most efficiently conducted in specialist groups consisting of researchers from various fields of science.
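The sequencing principle referred to above can be sketched with a mass-ladder readout: successive fragment masses in a ladder differ by one nucleotide residue, so the mass differences spell out the sequence. The average residue masses below are standard textbook values for ribonucleotide residues, and the fragment ladder itself is invented for illustration:

```python
# Average masses (Da) of ribonucleotide residues in an RNA chain.
RESIDUE_MASS = {"A": 329.21, "C": 305.18, "G": 345.21, "U": 306.17}

def read_ladder(masses, tol=0.5):
    """Assign a base to each mass step in a sorted fragment-mass ladder."""
    seq = []
    for lo, hi in zip(masses, masses[1:]):
        diff = hi - lo
        base = min(RESIDUE_MASS, key=lambda b: abs(RESIDUE_MASS[b] - diff))
        if abs(RESIDUE_MASS[base] - diff) > tol:
            raise ValueError(f"step of {diff:.2f} Da matches no residue")
        seq.append(base)
    return "".join(seq)

ladder = [1000.0, 1329.2, 1634.4, 1979.6]   # invented fragment masses, Da
print(read_ladder(ladder))  # → ACG
```

Real spectra add complications (adducts, modified residues, incomplete ladders) that the software mentioned in the abstract exists to handle.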

Relevance: 30.00%

Publisher:

Abstract:

Free and Open Source Software (FOSS) has gained increased interest in the computer software industry, but assessing its quality remains a challenge. FOSS development is frequently carried out by globally distributed development teams, and all stages of development are publicly visible. Several product- and process-level quality factors can be measured using these public data. This thesis presents a theoretical background for software quality and metrics and their application in a FOSS environment. The information available from FOSS projects in three information spaces is presented, and a quality model suitable for use in a FOSS context is constructed. The model includes both process and product quality metrics, and takes into account the tools and working methods commonly used in FOSS projects. A subset of the constructed quality model is applied to three FOSS projects, highlighting both theoretical and practical concerns in implementing automatic metric collection and analysis. The experiment shows that useful quality information can be extracted from the vast amount of data available; in particular, projects vary in their growth rate, complexity, modularity and team structure.
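Process-level metrics of the kind the model draws on can be computed directly from publicly visible commit history. The snippet below is an illustrative sketch only: the metric definitions (monthly growth rate, top-committer share as a team-structure proxy) are common choices in repository mining, not the thesis's actual metric set, and the commit log is invented:

```python
from collections import Counter

def commit_metrics(commits):
    """commits: list of (author, month_index, lines_added) tuples."""
    authors = Counter(a for a, _, _ in commits)
    top_share = max(authors.values()) / len(commits)          # team structure
    months = sorted({m for _, m, _ in commits})
    added = sum(n for _, _, n in commits)
    growth_per_month = added / (months[-1] - months[0] + 1)   # growth rate
    return {"contributors": len(authors),
            "top_committer_share": top_share,
            "loc_added_per_month": growth_per_month}

log = [("alice", 0, 120), ("bob", 0, 40), ("alice", 1, 90),
       ("carol", 2, 30), ("alice", 2, 60)]
print(commit_metrics(log))
```

A high top-committer share suggests a concentrated team, one of the team-structure variations the experiment observed across projects.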

Relevance: 30.00%

Publisher:

Abstract:

Urbanization leads to irreversible land-use change with ecological consequences such as the loss and fragmentation of green areas and structural and functional changes in terrestrial and aquatic ecosystems. These consequences diminish ecosystem services important for human populations living in urban areas. All this results in a conflict: how can the needs of city growth and the principles of sustainable development be met simultaneously, and, in particular, how can important green areas within and around built-up areas be conserved? Urban planners and decision-makers have an important role here, since they must use ecological information, mainly from species and biotope inventories and biodiversity impact assessments, in determining the conservation values of green areas. The main aim of this thesis was to study the use of ecological information in the urban land-use planning and decision-making process in the Helsinki Metropolitan Area, Finland. First, the literature on the linkages between ecological and social systems in urban planning was reviewed, and based on the review a theoretical and conceptual framework was adapted for research in the Finnish urban setting. Second, the factors determining the importance and effectiveness of incorporating ecological information into the urban planning process, and the challenges related to its use, were studied. Third, the importance and use of Local Ecological Knowledge in urban planning were investigated. Fourth, the factors determining the consideration of urban green areas and related ecological information in political land-use decision-making were studied. Finally, in a case study illustrating the above considerations, the importance of urban stream ecosystems in land-use planning was investigated.
This thesis demonstrated that although there are several challenges in using ecological information effectively, it is considered an increasingly important part of the basic information used in the urban planning and decision-making process. The basic determinants of this are the recent changes in environmental legislation, but also the increasing appreciation of green areas and their conservation values by all stakeholders. In addition, Local Ecological Knowledge in its several forms can be a source of ecological information for planners if incorporated effectively into the process. This study also showed that rare or endangered species and biotopes, and the related ecological information, receive priority in the urban planning process and usually pass through the decision-making system. Furthermore, the Rekolanoja stream case indicates that planners and residents see the value of urban stream ecosystems as increasingly important for local health and social values, such as recreation and stress relief.

Relevance: 30.00%

Publisher:

Abstract:

The insulin receptor (IR), the insulin-like growth factor 1 receptor (IGF1R) and the insulin receptor-related receptor (IRR) are covalently linked homodimers made up of several structural domains. The molecular mechanism of ligand binding to the ectodomain of these receptors, and the resulting activation of their tyrosine kinase domain, is still not well understood. We have carried out an amino acid residue conservation analysis in order to reconstruct the phylogeny of the IR family. We have confirmed the location of ligand binding site 1 of the IGF1R and the IR. Importantly, we have also predicted the likely location of insulin binding site 2 on the surface of the fibronectin type III domains of the IR. An evolutionarily conserved surface on the second leucine-rich domain that might interact with the ligand could not be detected. We suggest a possible mechanical trigger of IR activation that involves a slight ‘twist’ rotation of the last two fibronectin type III domains to face the likely location of insulin. Finally, strong selective pressure was found amongst the IRR orthologous sequences, suggesting that this orphan receptor has a yet unknown physiological role, which may be conserved from amphibians to mammals.
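A residue conservation analysis of this kind scores each alignment column by how invariant it is across homologues; conserved surface patches then suggest functional sites such as ligand-binding surfaces. The sketch below uses the simplest possible score (fraction of sequences sharing the most common residue), which is a simplification of real conservation measures, and the toy alignment is invented:

```python
from collections import Counter

def conservation(alignment):
    """Fraction of sequences sharing the most common residue, per column."""
    cols = zip(*alignment)
    return [Counter(col).most_common(1)[0][1] / len(alignment) for col in cols]

# Four invented aligned sequence fragments of equal length:
aln = ["MKTLV", "MKSLV", "MRTLV", "MKTLI"]
print(conservation(aln))  # → [1.0, 0.75, 0.75, 1.0, 0.75]
```

Mapping such per-column scores onto a 3D structure is what allows a conserved (or, as here for the leucine-rich domain, non-conserved) surface to be detected.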

Relevance: 30.00%

Publisher:

Abstract:

Corporate executives require relevant and intelligent business information in real time to take strategic decisions, and they require the freedom to access this information anywhere and anytime. There is thus a need to extend this functionality beyond the office and put it at the fingertips of decision makers. The Mobile Business Intelligence Tool (MBIT) aims to provide these features in a flexible and cost-efficient manner. This paper describes the detailed architecture of MBIT, which overcomes the limitations of existing mobile business intelligence tools. Further, a detailed implementation framework is presented to realize the design. This research highlights the benefits of using service-oriented architecture to design flexible and platform-independent mobile business applications. © 2009 IEEE.

Relevance: 30.00%

Publisher:

Abstract:

A geodesic-based approach using Lamb waves is proposed to locate the acoustic emission (AE) source and damage in an isotropic metallic structure. In the AE (passive) technique, the elastic waves take the shortest path from the source to the sensor array distributed in the structure. The geodesics are computed on the meshed surface of the structure using graph theory based on Dijkstra's algorithm. By virtually propagating the waves in reverse from the sensors along the geodesic paths and locating the first intersection point of these waves, one can obtain the AE source location. The same approach is extended to the detection of damage in a structure. The wave response matrix of the given sensor configuration is obtained experimentally for both the healthy and the damaged structure; the difference between the two matrices gives information about the reflection of waves from the damage. These reflected waves are back-propagated from the sensors, and the above method is used to locate the damage by finding the point where the geodesics intersect. In this work, the geodesic approach is shown to yield a practicable source location solution in a general set-up, on any arbitrary surface containing finite discontinuities. Experiments were conducted on aluminum specimens of simple and complex geometry to validate this new method.
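The core computation can be sketched on a toy mesh: Dijkstra's algorithm gives the shortest on-surface path from each sensor to every node, and the source estimate is the node whose geodesic travel times best reconcile the measured arrivals (the discrete analogue of back-propagated fronts intersecting). The 4-node graph, wave speed and arrival times below are invented, and picking the node with the smallest spread of implied emission times is one simple way to realise the intersection idea:

```python
import heapq

def dijkstra(graph, start):
    """Shortest path lengths from start on a weighted graph (adjacency dict)."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def locate_source(graph, sensors, arrivals, speed):
    """Pick the node minimising the spread of implied emission times."""
    per_sensor = {s: dijkstra(graph, s) for s in sensors}
    def spread(node):
        t0 = [arrivals[s] - per_sensor[s][node] / speed for s in sensors]
        return max(t0) - min(t0)
    return min(graph, key=spread)

# Square mesh A-B-C-D with one diagonal; edge weights are path lengths (mm).
mesh = {"A": {"B": 10, "D": 10, "C": 14},
        "B": {"A": 10, "C": 10},
        "C": {"B": 10, "D": 10, "A": 14},
        "D": {"A": 10, "C": 10}}
# Burst at node C, speed 5 mm/us: arrivals equal geodesic distance / speed.
arrivals = {"A": 2.8, "B": 2.0, "D": 2.0}
print(locate_source(mesh, ["A", "B", "D"], arrivals, speed=5.0))  # → C
```

On a real meshed surface the graph nodes are mesh vertices and the edge weights are surface distances, so the same search respects holes and other discontinuities.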

Relevance: 30.00%

Publisher:

Abstract:

We consider the problem of transmission of correlated discrete-alphabet sources over a Gaussian multiple access channel (GMAC). A distributed bit-to-Gaussian mapping is proposed which yields jointly Gaussian codewords. This can guarantee lossless transmission, or lossy transmission with given distortions, when possible. The technique can be extended to systems with side information at the encoders and decoder.
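One simple way to realise a bit-to-Gaussian mapping is to let each k-bit block index a quantile of the standard normal, so the codeword letters are approximately Gaussian while remaining a deterministic function of the source bits. This sketch illustrates only that general idea; the paper's distributed mapping for correlated sources is more involved:

```python
from statistics import NormalDist

def bits_to_gaussian(bits, k=3):
    """Map each k-bit block to the midpoint quantile of N(0, 1)."""
    norm = NormalDist()
    out = []
    for i in range(0, len(bits), k):
        block = bits[i:i + k]
        idx = int("".join(map(str, block)), 2)       # block as an integer
        out.append(norm.inv_cdf((idx + 0.5) / 2 ** k))
    return out

x = bits_to_gaussian([1, 0, 1, 0, 0, 1], k=3)
print([round(v, 3) for v in x])
```

As k grows, the empirical distribution of the mapped letters approaches N(0, 1), which is what makes jointly Gaussian codebooks attainable from discrete sources.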

Relevance: 30.00%

Publisher:

Abstract:

We propose a quantity called information ambiguity that plays the same role in worst-case information-theoretic analyses as the well-known notion of information entropy plays in the corresponding average-case analyses. We prove various properties of information ambiguity and illustrate its usefulness in performing the worst-case analysis of a variant of the distributed source coding problem.

Relevance: 30.00%

Publisher:

Abstract:

This paper uses original survey data on victims of the Great East Japan earthquake disaster to examine their decision to apply for temporary housing, as well as the timing of the application. We assess the effects of victims' attachment to their locality and of variation in victims' information-seeking behavior. We additionally consider various factors, such as income, age, employment and family structure, that are generally considered to affect the decision to choose temporary housing as a solution to displacement. Empirical results indicate that, ceteris paribus, as the degree of attachment increases, victims are more likely to apply for temporary housing, but attachment does not affect the timing of the application. On the other hand, victims who actively seek information and are able to collect higher-quality information are less likely to apply for temporary housing, and if they do apply, they apply relatively late.