961 results for Collection development (Libraries)
Abstract:
Rapid development in recent years has accelerated the process of developing new drugs. Combinatorial chemistry has made it possible to synthesize large collections of structurally diverse molecules, so-called combinatorial libraries, for biological screening. In screening, the structure-related activity of the molecules is examined with several different biological assays to find potential "hits", some of which may later be developed into new drug substances. For the results of the biological studies to be reliable, the synthesized compounds must be as pure as possible. High-throughput (HTP) purification is therefore needed to guarantee high-quality compounds and reliable biological data. Constantly growing throughput requirements have led to the automation and parallelization of these purification techniques. Preparative LC/MS is well suited to the fast and efficient purification of combinatorial libraries. Many factors, such as the properties of the separation column and the flow gradient, affect the efficiency of the preparative LC/MS purification process; these parameters must be optimized for the best result. In this work, basic compounds were studied under different flow conditions. A method for determining the purity level of combinatorial libraries after LC/MS purification was optimized, and the purity of selected compounds from different libraries was determined before purification.
Abstract:
In the software industry, long and difficult development cycles can be eased by exploiting software frameworks. A software framework is a collection of classes that provides general solutions to the needs of a specific problem domain, freeing software developers to concentrate on application-specific requirements. Using well-designed frameworks increases the reuse of both design solutions and source code more than any other design approach. Knowledge of a given domain can be captured in a framework, from which finished software products can then be specialized. This thesis describes the design and implementation of a software framework based on software agents. The emphasis is on describing a design that meets the requirements specification, and its implementation, for a framework from which applications capable of various kinds of data collection in an Internet environment can be specialized. The experimental part of the work also presents an example application built on the framework developed in the thesis.
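To make the specialization idea concrete, the sketch below shows a framework-style base class whose template method fixes a data collection workflow while subclasses fill in the application-specific hooks. The class and method names are invented for illustration and are not taken from the thesis.

```python
from abc import ABC, abstractmethod

class CollectionAgent(ABC):
    """Framework base class: run() fixes the reusable workflow, while
    subclasses specialise the hook methods for a concrete task."""

    def run(self):
        for source in self.discover_sources():
            raw = self.fetch(source)
            self.store(self.parse(raw))

    @abstractmethod
    def discover_sources(self): ...   # e.g. URLs to visit

    @abstractmethod
    def fetch(self, source): ...      # retrieve raw content

    @abstractmethod
    def parse(self, raw): ...         # extract the data of interest

    def store(self, records):         # default hook; may be overridden
        print(records)

class NewsHeadlineAgent(CollectionAgent):
    """An application specialised from the framework."""
    def discover_sources(self):
        return ["https://example.com/feed"]
    def fetch(self, source):
        return "<html><h1>Example headline</h1></html>"  # stub instead of real HTTP
    def parse(self, raw):
        return [raw[raw.find("<h1>") + 4 : raw.find("</h1>")]]

NewsHeadlineAgent().run()  # -> ['Example headline']
```

The point of the design is that run(), the reusable framework logic, never changes; only the hook methods are specialized per application.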
Abstract:
The aim of this thesis is to create a business model that supports the creation of a market for wireless mobile services in emerging markets. The theoretical part examines the key elements of developing a business model for wireless mobile services in the CIS countries. The theoretical chapter yields a framework with which a business model for mobile services can be developed. The empirical part of the thesis was carried out as a case study whose objective was the creation of a market for wireless mobile services in the CIS countries. The main source of empirical data was thematic interviews. The findings of the empirical part are compared with the corresponding results of the theoretical chapter. The results show that creating a market for a radical high-technology innovation is a slow process that requires patience from the company. Market, technology and strategy uncertainties bring uncertainty to the emerging industry and market, which complicates the development of the business model. The most important factor is marketing the services rather than the technology. The key capability in market creation is learning, not knowing.
Abstract:
The article discusses the development of WEBDATANET, established in 2011 as a COST Action (IS 1004) to create a multidisciplinary network of web-based data collection experts in Europe: (web) survey methodologists, psychologists, sociologists, linguists, economists, Internet scientists, and media and public opinion researchers. The network comprises 190 experts in 30 European countries and abroad and has established web-based teaching and discussion platforms as well as working groups and task forces; the scope of the research carried out by WEBDATANET is also discussed. In light of the growing importance of web-based data in the social and behavioral sciences, the aim was to accumulate and synthesize knowledge regarding methodological issues of web-based data collection (surveys, experiments, tests, non-reactive data, and mobile Internet research) and to foster its scientific usage in a broader community.
Abstract:
Despite the development of novel typing methods based on whole genome sequencing, most laboratories still rely on classical molecular methods for outbreak investigation or surveillance. Reference methods for Clostridium difficile include ribotyping and pulsed-field gel electrophoresis, which are band-comparing methods that are often difficult to establish and require reference strain collections. Here, we present the double locus sequence typing (DLST) scheme as a tool to analyse C. difficile isolates. Using a collection of clinical C. difficile isolates recovered during a 1-year period, we evaluated the performance of DLST and compared the results to multilocus sequence typing (MLST), a sequence-based method that has been used to study the structure of bacterial populations and highlight major clones. DLST had a higher discriminatory power than MLST (Simpson's index of diversity of 0.979 versus 0.965) and successfully typed all isolates of the study (100% typeability). Previous studies showed that the discriminatory power of ribotyping was comparable to that of MLST; thus, DLST might be more discriminatory than ribotyping. DLST is easy to establish and provides several advantages, including no DNA extraction [polymerase chain reaction (PCR) is performed directly on colonies], no specific instrumentation, low cost and unambiguous definition of types. Moreover, implementing the DLST typing scheme in an Internet database, as previously done for Staphylococcus aureus and Pseudomonas aeruginosa (http://www.dlst.org), will allow users to obtain the DLST type simply by submitting sequencing files, and will avoid the problems associated with multiple databases.
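For reference, the discriminatory-power figures quoted above use Simpson's index of diversity, which in typing studies is usually computed with the Hunter-Gaston formulation from the number of isolates assigned to each type. A minimal sketch follows; the type counts are made up, not data from the study.

```python
from collections import Counter

def simpsons_diversity(type_assignments):
    """Hunter-Gaston / Simpson's index of diversity:
    D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)),
    where n_j is the number of isolates of type j and N the total."""
    counts = Counter(type_assignments).values()
    n_total = sum(counts)
    if n_total < 2:
        raise ValueError("need at least two isolates")
    return 1 - sum(n * (n - 1) for n in counts) / (n_total * (n_total - 1))

# Hypothetical typing results (not data from the study):
dlst_types = ["1-1", "1-1", "2-3", "4-1", "5-2", "5-2", "6-1", "7-4"]
print(f"Simpson's index of diversity: {simpsons_diversity(dlst_types):.3f}")
```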
Abstract:
In recent years, many protocols for reproducibly sequencing reduced-genome subsets in non-model organisms have been published. Among them, RAD-sequencing is one of the most widely used. It relies on digesting DNA with specific restriction enzymes and performing size selection on the resulting fragments. Despite its acknowledged utility, the method is of limited use with degraded DNA samples, such as those isolated from museum specimens, because such samples are less likely to harbor fragments long enough to comprise two restriction sites, which is necessary for ligating the adapter sequences (in the case of double-digest RAD) or performing size selection on the resulting fragments (in the case of single-digest RAD). Here, we address these limitations with a novel method called hybridization RAD (hyRAD). In this approach, biotinylated RAD fragments covering a random fraction of the genome are used as baits for capturing homologous fragments from genomic shotgun sequencing libraries. This simple and cost-effective approach allows sequencing of orthologous loci even from highly degraded DNA samples, opening new avenues of research in the field of museum genomics. Because it does not rely on the presence of restriction sites, it also improves among-sample locus coverage. In a trial study, hyRAD allowed us to obtain a large set of orthologous loci from fresh and museum samples of a non-model butterfly species, with a high proportion of single nucleotide polymorphisms present in all eight analyzed specimens, including 58-year-old museum samples. The utility of the method was further validated using 49 museum and fresh samples of a Palearctic grasshopper species for which the spatial genetic structure had previously been assessed using mtDNA amplicons. The application of the method in a wider context is eventually discussed. As hyRAD does not depend on the presence of restriction sites, it is not sensitive to among-sample polymorphisms in the restriction sites, which usually cause locus dropout; this should enable the application of hyRAD to analyses at broader evolutionary scales.
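To illustrate why degraded samples fail the double-digest requirement described above, a small in-silico digest can count template fragments that carry two different restriction sites within a selectable size window; long templates from fresh DNA pass far more often than the short templates typical of museum specimens. The enzyme sites, size limits and simulated genome below are arbitrary assumptions for the example.

```python
import random
import re

def double_digest_fragments(sequence, site_a="GAATTC", site_b="TTAA",
                            min_len=200, max_len=400):
    """Toy double digest: cut wherever either enzyme's recognition site occurs,
    keep fragments flanked by two *different* sites within the size window."""
    cuts = sorted(
        [(m.start(), "A") for m in re.finditer(site_a, sequence)]
        + [(m.start(), "B") for m in re.finditer(site_b, sequence)]
    )
    return [
        sequence[p1:p2]
        for (p1, e1), (p2, e2) in zip(cuts, cuts[1:])
        if e1 != e2 and min_len <= p2 - p1 <= max_len
    ]

random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(100_000))

# Fresh, high-molecular-weight DNA: digest the full-length template.
fresh = double_digest_fragments(genome)

# Degraded DNA: the same genome pre-sheared into ~300 bp templates; a usable
# fragment must now fit two different sites inside a single short template.
degraded = [
    frag
    for i in range(0, len(genome), 300)
    for frag in double_digest_fragments(genome[i:i + 300])
]

print(len(fresh), len(degraded))  # far fewer usable fragments from degraded DNA
```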
Abstract:
This study compared the efficiency of various sampling materials for the collection and subsequent analysis of organic gunshot residues (OGSR). To the best of our knowledge, this is the first time that sampling devices have been investigated in detail for subsequent quantitation of OGSR by LC-MS. Seven sampling materials, namely two "swab"-type and five "stub"-type collection materials, were tested. The investigation started with the development of a simple and robust LC-MS method able to separate and quantify molecules typically found in gunpowders, such as diphenylamine or ethylcentralite. The sampling materials were then systematically evaluated by first analysing blank extracts of the materials to check for potential interferences and by determining matrix effects. Based on these results, the best four materials, namely cotton buds, polyester swabs, a tape from 3M and PTFE, were compared in terms of collection efficiency in shooting experiments using a set of 9 mm Luger ammunition. The tape was found to recover the highest amounts of OGSR. As tape-lifting is the technique currently used routinely for inorganic GSR, OGSR analysis might be implemented without modifying IGSR sampling and analysis procedures.
Abstract:
In this paper we describe a browsing and searching personalization system for digital libraries based on the use of ontologies for describing the relationships between all the elements which take part in a digital library scenario of use. The main goal of this project is to help the users of a digital library to improve their experience of use by means of two complementary strategies: first, by maintaining a complete history record of his or her browsing and searching activities, which is part of a navigational user profile which includes preferences and all the aspects related to community involvement; and second, by reusing all the knowledge which has been extracted from previous usage by other users with similar profiles. This can be accomplished in terms of narrowing and focusing the search results and browsing options through the use of a recommendation system which organizes such results in the most appropriate manner, using ontologies and concepts drawn from the semantic web field. The complete integration of the experience of use of a digital library in the learning process is also pursued. Both the usage and the information organization can also be exploited to extract useful knowledge from the way users interact with a digital library, knowledge that can be used to improve several design aspects of the library, ranging from internal organization aspects to human factors and user interfaces. Although this project is still at an early development stage, it is possible to identify all the desired functionalities and requirements that are necessary to fully integrate the use of a digital library in an e-learning environment.
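The second strategy above, reusing knowledge from users with similar profiles, is essentially a profile-similarity recommendation. A minimal cosine-similarity sketch of the idea follows; the profile vectors and item identifiers are invented for illustration and do not come from the project.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse profiles (dicts of feature -> weight)."""
    dot = sum(u[k] * v[k] for k in set(u) & set(v))
    norm = math.sqrt(sum(x * x for x in u.values())) * \
           math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

def recommend(target, others, k=2):
    """Suggest items seen by the k most similar profiles but not by the target."""
    ranked = sorted(others, key=lambda o: cosine(target["topics"], o["topics"]),
                    reverse=True)
    seen = set(target["items"])
    suggestions = []
    for other in ranked[:k]:
        suggestions += [i for i in other["items"] if i not in seen]
    return suggestions

# Hypothetical navigational profiles:
alice = {"topics": {"ontology": 3, "semantic-web": 2}, "items": ["paper-17"]}
others = [
    {"topics": {"ontology": 2, "e-learning": 1}, "items": ["paper-17", "paper-42"]},
    {"topics": {"databases": 4}, "items": ["paper-99"]},
]
print(recommend(alice, others, k=1))  # -> ['paper-42']
```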
Abstract:
The Kenyan forestry and sawmilling industry has been subject to a changing environment since 1999, when the industrial forest plantations were closed down. This has lowered raw material supply and has reduced sawmill operations and the viability of sawmill enterprises. The capacity of the 276 registered sawmills is not sufficient to fulfill sawn timber demand in Kenya, a consequence of technological degradation and the lack of a qualified labor force, which in turn stem from the absence of sawmilling education and further training in Kenya. The lack of competent sawmill workers has led to low raw material recovery, under-utilization of resources and loss of employment. The objective of the work was to suggest models, methods and approaches for the competence and capacity development of the Kenyan sawmilling industry, sawmills and their workers. A nationwide field survey, interviews, a questionnaire and a literature review were used for data collection to identify the sawmills' competence development areas and to suggest models and methods for their capacity building. The sampling frame included 22 sawmills, representing 72.5% of all the registered sawmills in Kenya. The results confirmed that the sawmills' technological level was backward, productivity low, raw material recovery unacceptable and workers' professional education low. The future challenge will be how to establish the sawmills' capacity building and workers' competence development. Sawmilling industry development requires various actions through new development models and approaches. Activities should be started for technological development and workers' competence development. This requires re-starting vocational training in sawmilling and establishing more effective co-operation between the sawmills and their stakeholder groups. In competence development, the Enterprise Competence Management Model of Nurminen (2007) can be used, whereas the best training model and approach would be a practically oriented learning-at-work model in which short courses, technical assistance and extension services would be the key functions.
Abstract:
Aim: To investigate and understand patients' satisfaction with nursing care in the intensive care unit, to identify the dimensions of the concept of "satisfaction" from the patient's point of view, and to design and validate a questionnaire that measures satisfaction levels in critical patients. Background: There are many instruments capable of measuring satisfaction with nursing care; however, they do not address the reality for critical patients, nor are they applicable in our context. Design: A dual approach study comprising a qualitative phase employing Grounded Theory and a quantitative, descriptive phase to prepare and validate the questionnaire. Methods: Data collection in the qualitative phase will consist of in-depth interviews after theoretical sampling, an on-site diary and an expert discussion group. The sample size will depend on the expected theoretical saturation (n = 27-36). Analysis will be based on Grounded Theory. For the quantitative phase, sampling will be based on convenience (n = 200). A questionnaire will be designed on the basis of the qualitative data. Descriptive and inferential statistics will be used. Validation will address content validity, construct criteria and the reliability of the instrument, the latter via Cronbach's alpha and a test-retest approach. The approval date for this protocol was November 2010. Discussion: Self-perceptions, beliefs, experiences, and demographic, socio-cultural, epistemological and political factors are determinants of satisfaction, and these should be taken into account when compiling a questionnaire on satisfaction with nursing care among critical patients.
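For the reliability step mentioned in the methods, Cronbach's alpha is computed from the item variances and the variance of the total score: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). The sketch below is a generic illustration with made-up questionnaire responses, not study data.

```python
import statistics

def cronbachs_alpha(item_scores):
    """item_scores: one inner list of respondent scores per questionnaire item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]  # per-respondent totals
    item_var = sum(statistics.variance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))

# Hypothetical 4-item questionnaire answered by 5 patients (1-5 Likert scale):
items = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [3, 5, 2, 4, 4],
    [4, 4, 3, 4, 5],
]
print(f"alpha = {cronbachs_alpha(items):.2f}")  # -> alpha = 0.91
```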
Abstract:
The increasing incidence of type 1 diabetes has led researchers on a quest to find the reason behind this phenomenon. The rate of increase is too great to be caused simply by changes in the genetic component, and many environmental factors are under investigation for their possible contribution. These studies require, however, the participation of those individuals most likely to develop the disease, and the approach chosen by many is to screen vast populations for persons with increased genetic risk factors. The participating individuals are then followed for signs of disease development, and their exposure to suspected environmental factors is studied. The main purpose of this study was to find a suitable tool for easy and inexpensive screening of certain genetic risk markers for type 1 diabetes. The method should be applicable to whole blood dried on sample collection cards as the sample material, since shipping and storing samples in this format is preferred. However, screening vast sample libraries of extracted genomic DNA should also be possible if such a need arises, for example when studying the effect of newly discovered genetic risk markers. The method developed in this study is based on homogeneous assay chemistry and an asymmetrical polymerase chain reaction (PCR). The generated single-stranded PCR product is probed by lanthanide-labelled, LNA (locked nucleic acid)-spiked, short oligonucleotides with exactly complementary sequences. In the case of a perfect match, the probe hybridises to the product. However, if there is even a single nucleotide difference, the probe binds not to the PCR product but to a complementary quencher oligonucleotide labelled with a dabcyl moiety, causing the signal of the lanthanide label to be quenched. The method was applied to screening the well-known type 1 diabetes risk alleles of the HLA-DQB1 gene. It was shown to be suitable as an initial screening step covering thousands of samples in the scheme used in the TEDDY (The Environmental Determinants of Diabetes in the Young) study to identify individuals at increased genetic risk. The method was further developed into a dry-reagent form to allow an even simpler approach to screening: the reagents needed in the assay are dried in the reaction vessel, and performing the assay requires only the addition of the sample and, if necessary, water to rehydrate the reagents. This allows the assay to be successfully executed even by a person with minimal laboratory experience.
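The signal logic of the assay (a perfect probe match yields signal; any mismatch routes the probe to the quencher) can be mimicked with a toy string comparison. This is only a sketch of the decision logic: the sequences are invented, and real hybridisation is governed by melting thermodynamics, not exact string equality.

```python
def probe_signal(product, probe):
    """Toy model of the quenching assay: the lanthanide-labelled probe reports
    signal only if it matches the single-stranded PCR product exactly at some
    position; otherwise it is assumed to hybridise to the dabcyl quencher."""
    matched = any(
        product[i:i + len(probe)] == probe
        for i in range(len(product) - len(probe) + 1)
    )
    return "signal (perfect match)" if matched else "quenched (mismatch)"

product = "ACGGTTCAGGATTACA"              # hypothetical single-stranded PCR product
print(probe_signal(product, "TCAGGATT"))  # perfect match -> signal
print(probe_signal(product, "TCAGGCTT"))  # single-nucleotide mismatch -> quenched
```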
Abstract:
The size and complexity of software development projects are growing very fast. At the same time, the proportion of successful projects is still quite low according to previous research. Although almost every project team knows the main areas of responsibility which would help to finish a project on time and on budget, this knowledge is rarely used in practice. It is therefore important to evaluate the success of existing software development projects and to suggest a method for evaluating the chances of success that can be used in software development projects. The main aim of this study is to evaluate the success of projects in the selected geographical region (Russia-Ukraine-Belarus). The second aim is to compare existing models of success prediction and to determine their strengths and weaknesses. The research was done as an empirical study. A survey with structured forms and theme-based interviews were used as the data collection methods. Information gathering was done in two stages. In the first stage, the project manager or someone with similar responsibilities answered the questions over the Internet. In the second stage, the participant was interviewed and his or her answers were discussed and refined. This made it possible to get accurate information about each project and to avoid errors. It was found that there are many problems in software development projects. These problems are widely known and have been discussed in the literature many times. The research showed that most of the projects have problems with schedule, requirements, architecture, quality, and budget. A comparison of two models of success prediction showed that The Standish Group model overestimates problems in a project, whereas McConnell's model can help to identify problems in time and avoid trouble in the future. A framework for evaluating the chances of success in distributed projects was suggested. The framework is similar to The Standish Group model but customized for distributed projects.
Abstract:
The aim of this study was to develop an automated bench-top electronic penetrometer (ABEP) that allows tests to be performed with a high rate of data acquisition (up to 19,600 Hz) and with variation of both the displacement velocity and the base area of the penetration cone. The mechanical components of the ABEP are a supporting structure, a stepper motor, a velocity reducer, a double-nut ball screw and six penetration probes. The electronic components are a driver to control rotation and displacement, a power supply, three load cells, two software programs for running tests and storing data, and a data acquisition module. The penetrometer is compact and portable, and in 32 validation tests it proved easy to operate and showed high resolution, high velocity and reliability in data collection. During the validation tests the equipment met the objectives: the test results showed that the ABEP could use different sizes of cones and allowed work at different velocities, and the deviations for velocity and displacement were only 1.3% and 0.7%, respectively, at the highest velocity (30 mm s-1) and 1% and 0.9%, respectively, at the lowest velocity (0.1 mm s-1).
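Since a penetrometer derives penetration resistance from the load cell force and the cone base area, a small worked calculation makes the measurement explicit. The force value below is an invented reading, and the 12.83 mm base diameter is a commonly used standard cone size assumed here only for illustration.

```python
import math

def cone_resistance_mpa(force_n, cone_diameter_mm):
    """Penetration resistance = force / cone base area.
    Returns MPa given force in newtons and cone base diameter in millimetres."""
    radius_m = cone_diameter_mm / 2 / 1000
    area_m2 = math.pi * radius_m ** 2
    return force_n / area_m2 / 1e6

# Hypothetical reading: 45 N on a cone with a 12.83 mm base diameter.
print(f"{cone_resistance_mpa(45, 12.83):.2f} MPa")  # -> 0.35 MPa
```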
Abstract:
The target company of this study is a large machinery company engaged, inter alia, in the energy and pulp engineering, procurement and construction management (EPCM) supply business. The main objective of this study was to develop the cost estimation of the target company by providing more accurate, reliable and up-to-date information through the enterprise resource planning (ERP) system. Another objective was to find cost-effective methods of collecting total cost of ownership information to support more informed supplier selection decision making. This study is primarily action-oriented, but also constructive, and it can be divided into two sections: a theoretical literature review and an empirical study of the abovementioned part of the target company's business. The development of information collection is based, in addition to the literature review, on nearly 30 qualitative interviews with employees at various organizational units, functions and levels of the target company. At the core of the development was making the initial data more accurate, reliable and available, a necessary prerequisite for informed use of the information. Development suggestions and paths were presented in order to regain confidence in the ERP system as an information source by reorganizing the work breakdown structure and by complementing mere cost information with quantitative, technical and scope information. Several methods of using the information even more effectively were also discussed. While implementation of the development suggestions exceeded the scope of this study, it was taken forward in a test environment and among interest groups.
Abstract:
Because of the increased availability of different kinds of business intelligence technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management is not only about the management of technology, but also about the management of processes and people. This thesis focuses on traditional data management and on the performance management of production processes, both of which can be seen as requirements for long-lasting development. Some operative BI solutions are also considered in describing the ideal state of the reporting system. The objectives of this study are to examine what requirements effective performance management of production processes places on a company's data management and reporting, and to see how these affect its efficiency. The research was executed as a theoretical literature review of the subjects and as a qualitative case study of the reporting development project of Finnsugar Ltd. The case study is examined through theoretical frameworks and by active participant observation. To get a better picture of the ideal state of the reporting system, simple investment calculations are performed. According to the results of the research, the requirements for effective performance management of production processes are automation of data collection, integration of operative databases, usage of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and efficient management of processes, data and roles.
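As a concrete illustration of the ETL stage named in the conclusions, the sketch below extracts rows from a CSV export of an operative system, transforms them, and loads them into a SQLite table standing in for the data warehouse. The file name, column names and table schema are invented for the example, not taken from the case company.

```python
import csv
import sqlite3

# Create a tiny stand-in for an operative system's CSV export (hypothetical data).
with open("production_export.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["batch_id", "line", "output_kg"])
    w.writerows([["B001", "L1", "2500"], ["B002", "L1", ""], ["B003", "L2", "1800"]])

# Extract: read the raw production records.
with open("production_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: normalise units (kg -> tonnes) and discard incomplete records.
clean = [
    (r["batch_id"], r["line"], float(r["output_kg"]) / 1000.0)
    for r in rows
    if r["output_kg"]
]

# Load: append into a warehouse fact table.
con = sqlite3.connect("warehouse.db")
con.execute(
    "CREATE TABLE IF NOT EXISTS fact_production "
    "(batch_id TEXT, line TEXT, output_t REAL)"
)
con.executemany("INSERT INTO fact_production VALUES (?, ?, ?)", clean)
con.commit()
con.close()
```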