Abstract:
Background: Analysing observed differences in the incidence of or mortality from a particular disease between two situations (such as time points, geographical areas, gender, or other social characteristics) can be useful for both scientific and administrative purposes. From an epidemiological and public health point of view, it is of great interest to assess the effect of demographic factors on these observed differences in order to elucidate the effect of the risk of developing a disease or dying from it. The method proposed by Bashir and Estève, which splits the observed variation into three components (risk, population structure, and population size), is a common choice in practice. Results: A web-based application called RiskDiff has been implemented (available at http://rht.iconcologia.net/riskdiff.htm) to perform this kind of statistical analysis, providing text and graphical summaries. Code for the implemented functions in R is also provided. An application to cancer mortality data from Catalonia is used for illustration. Conclusions: Combining epidemiological and demographic factors is crucial for analysing incidence of or mortality from a disease, especially if the population pyramids show substantial differences. The tool implemented may serve to promote and disseminate the use of this method, guiding epidemiological interpretation and decision making in public health.
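The three-way split itself is easy to picture. Below is a minimal Python sketch of a sequential decomposition in the spirit of the Bashir and Estève method: expected cases are written as N * sum_a(w_a * r_a) over age groups, and one factor is swapped at a time. The function names, age groups, and rates are hypothetical, and the published method (and the R code shipped with RiskDiff) refines this simple ordering.

```python
import numpy as np

def expected_cases(total_pop, age_structure, rates):
    """Expected number of cases: N * sum_a w_a * r_a."""
    return total_pop * np.sum(age_structure * rates)

def decompose_difference(n1, w1, r1, n2, w2, r2):
    """Split the change in cases between two situations into
    population-size, population-structure and risk components
    by swapping one factor at a time (one possible ordering)."""
    d1 = expected_cases(n1, w1, r1)
    size_effect = expected_cases(n2, w1, r1) - d1
    structure_effect = expected_cases(n2, w2, r1) - expected_cases(n2, w1, r1)
    risk_effect = expected_cases(n2, w2, r2) - expected_cases(n2, w2, r1)
    return size_effect, structure_effect, risk_effect

# Hypothetical example: three age groups, rates per person-year.
w1 = np.array([0.50, 0.35, 0.15]); r1 = np.array([1e-5, 1e-4, 1e-3])
w2 = np.array([0.40, 0.35, 0.25]); r2 = np.array([1e-5, 9e-5, 8e-4])
size, structure, risk = decompose_difference(5_000_000, w1, r1, 5_500_000, w2, r2)
print(f"size={size:.1f}, structure={structure:.1f}, risk={risk:.1f}")
```

By construction the three components sum exactly to the observed difference in cases between the two situations.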
Abstract:
The objective of this article is to systematically assess the quality of French-language web-based information on alcohol dependence. Using a standardised pro forma, the authors analysed the 20 most highly ranked pages identified by 3 common internet search engines using 2 keywords. A total of 45 sites were analysed. The authors conclude that the overall quality of the sites was relatively poor, especially for the description of possible treatments, albeit with wide variability. Content quality was not correlated with other aspects of quality such as interactivity, aesthetics, or accountability.
Abstract:
The aim of this study was to examine the stages of building a company's web operations and the measurement of their success. The building process was studied using a five-step model, whose steps are assessment, strategy formulation, planning, blueprint, and implementation. To complement the assessment and implementation phases, and especially to support the measurement of the success of internet operations, the benefits of internet operations (CRM, communication, sales, and distribution channel benefits from a marketing perspective) were discussed. To aid in evaluating success, a staircase model for internet operations was also presented; it defines the storefront, dynamic, transaction, and e-business steps. The study identified success factors for internet operations: high-quality content, attractiveness, entertainment value, informativeness, timeliness, personalisation, trust, interactivity, usability, convenience, loyalty, performance, responsiveness, and the collection of user data. The metrics were divided into activity, behaviour, and conversion metrics, and further metrics and success indicators were presented. These elements of success and the metrics were brought together in a new model for evaluating the success of internet operations. In the empirical part of the thesis, the presented theories were mirrored against the web operations of ABB (within ABB, specifically ABB Stotz-Kontakt), using document analysis and interviews. The empirical part illustrated the theories in practice and revealed the possibility of extending them. The model for building internet operations can also be used for developing existing web operations, and the staircase model is also suitable for evaluating existing internet operations. Applying the metrics in practice, however, revealed a need for their further development and for additional research on the topic; they should also be tied more closely to the measurement of overall business success.
Abstract:
BACKGROUND: No previous studies have explored how closely women follow their psychotropic drug regimens during pregnancy. This study aimed to explore patterns of, and factors associated with, low adherence to psychotropic medication during pregnancy. METHODS: A multinational web-based study was performed in 18 countries in Europe, North America, and Australia. Uniform data collection was ensured via an electronic questionnaire. Pregnant women were eligible to participate. Adherence was measured via the 8-item Morisky Medication Adherence Scale (MMAS-8). The Beliefs about Prescribed Medicines Questionnaire (BMQ-specific), the Edinburgh Postnatal Depression Scale (EPDS), and a numeric rating scale were used to measure women's beliefs, depressive symptoms, and antidepressant risk perception, respectively. Participants reporting use of psychotropic medication during pregnancy (n = 160) were included in the analysis. RESULTS: On the basis of the MMAS-8, 78 of 160 women (48.8%, 95% CI: 41.1-56.4%) demonstrated low adherence during pregnancy. The rates of low adherence were 51.3% for medication for anxiety, 47.2% for depression, and 42.9% for other psychiatric disorders. Smoking during pregnancy, elevated antidepressant risk perception (risk ≥ 6), and depressive symptoms were associated with significant 3.9-, 2.3-, and 2.5-fold increases in the likelihood of low medication adherence, respectively. Women on psychotropic polytherapy were less likely to demonstrate low adherence. The belief that the benefits of pharmacotherapy outweighed the risks correlated positively (r = 0.282) with higher medication adherence. CONCLUSIONS: Approximately one in two pregnant women using psychotropic medication demonstrated low adherence during pregnancy. Lifestyle factors, risk perception, depressive symptoms, and individual beliefs are important factors related to adherence to psychotropic medication in pregnancy.
Abstract:
The aim of this study was to examine the value network and business models of wireless internet services. The study was qualitative in nature and used a constructive case study as its strategy, with the Treasure Hunters mobile phone game as the example service. The study consisted of a theoretical and an empirical part. The theoretical part conceptually linked innovation, business models, and the value network, and laid a foundation for developing business models. The empirical part first focused on creating business models from the developed innovations, and finally aimed to define the value network needed to deliver the service. Innovation sessions, interviews, and a questionnaire survey were used as research methods. Based on the results, several business concepts were formed, along with a description of a basic value network model for wireless games. The conclusion was that wireless services require a value network consisting of several actors in order to be realised.
Abstract:
BACKGROUND: Available methods to simulate nucleotide or amino acid data typically use Markov models to simulate each position independently. These approaches are not appropriate for assessing the performance of combinatorial and probabilistic methods that look for coevolving positions in nucleotide or amino acid sequences. RESULTS: We have developed a web-based platform that gives user-friendly access to two phylogeny-based methods implementing the Coev model: the evaluation of coevolving scores and the simulation of coevolving positions. We have also extended the capabilities of the Coev model to allow for the generalisation of the alphabet used in the Markov model, which can now analyse both nucleotide and amino acid data sets. The simulation of coevolving positions is novel and builds upon the developments of the Coev model. It allows users to simulate pairs of dependent nucleotide or amino acid positions. CONCLUSIONS: The main focus of our paper is the new simulation method we present for coevolving positions. The implementation of this method is embedded within the Coev-web platform, which is freely accessible at http://coev.vital-it.ch/ and has been tested in most modern web browsers.
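To make the idea of simulating a pair of dependent positions concrete, here is a Python sketch of a continuous-time Markov chain over a 16-state dinucleotide alphabet, where transitions into a set of favoured "coevolving" profiles get a higher rate. The rates, the profile set, and the Gillespie-style simulation are illustrative assumptions, not the actual Coev parameterisation.

```python
import numpy as np

BASES = "ACGT"
STATES = [a + b for a in BASES for b in BASES]   # 16 dinucleotide pair states
COEV_PROFILES = {"AT", "TA", "GC", "CG"}         # hypothetical favoured pairs

def coev_rate_matrix(s=10.0, d=0.5):
    """CTMC rate matrix over pair states: one position changes at a time,
    with a higher rate s into favoured profiles and a lower rate d elsewhere."""
    n = len(STATES)
    Q = np.zeros((n, n))
    for i, x in enumerate(STATES):
        for j, y in enumerate(STATES):
            if i != j and sum(a != b for a, b in zip(x, y)) == 1:
                Q[i, j] = s if y in COEV_PROFILES else d
        Q[i, i] = -Q[i].sum()                    # rows sum to zero
    return Q

def simulate(Q, t, start, rng):
    """Gillespie simulation of the chain along a branch of length t."""
    i, clock = start, 0.0
    while True:
        rate = -Q[i, i]
        clock += rng.exponential(1.0 / rate)
        if clock > t:
            return STATES[i]
        probs = np.where(np.arange(len(STATES)) == i, 0.0, Q[i]) / rate
        i = rng.choice(len(STATES), p=probs)

rng = np.random.default_rng(42)
print([simulate(coev_rate_matrix(), t=1.0, start=STATES.index("AT"), rng=rng)
       for _ in range(5)])
```

Because the favoured profiles attract substitutions, simulated pairs drift towards (and between) those profiles, which is the qualitative behaviour a coevolution detector should be tested against.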
Abstract:
Occupational hygiene practitioners typically assess the risk posed by occupational exposure by comparing exposure measurements to regulatory occupational exposure limits (OELs). In most jurisdictions, OELs are only available for exposure by the inhalation pathway. Skin notations are used to indicate substances for which dermal exposure may lead to health effects. However, these notations are either present or absent and provide no indication of acceptable levels of exposure. Furthermore, the methodology and framework for assigning skin notations differ widely across jurisdictions, resulting in inconsistencies in the substances that carry notations. The UPERCUT tool was developed in response to these limitations. It helps occupational health stakeholders assess the hazard associated with dermal exposure to chemicals. UPERCUT integrates dermal quantitative structure-activity relationships (QSARs) and toxicological data to provide users with a skin hazard index called the dermal hazard ratio (DHR) for the substance and scenario of interest. The DHR is the ratio between the estimated 'received' dose and the 'acceptable' dose. The 'received' dose is estimated using physico-chemical data and information on the exposure scenario provided by the user (body parts exposed and exposure duration), and the 'acceptable' dose is estimated using inhalation OELs and toxicological data. The uncertainty surrounding the DHR is estimated with Monte Carlo simulation. Additional information on the selected substances includes the intrinsic skin permeation potential of the substance and the existence of skin notations. UPERCUT is the only available tool that estimates the absorbed dose and compares it to an acceptable dose. In the absence of dermal OELs, it provides a systematic and simple approach for screening dermal exposure scenarios for 1686 substances.
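The ratio structure of the DHR, and the Monte Carlo treatment of its uncertainty, can be sketched in a few lines of Python. Every numerical input below (permeation coefficient, concentration, exposed area, durations, OEL, breathing volume) is a hypothetical placeholder, not UPERCUT's actual data or model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # Monte Carlo draws

# Hypothetical inputs for one substance/scenario.
kp = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=N)  # permeation coeff, cm/h
conc_mg_cm3 = 50.0           # concentration in contact with the skin
area_cm2 = 840.0             # e.g. two hands exposed
duration_h = rng.uniform(0.5, 2.0, size=N)                # exposure duration

# 'Received' dermal dose (mg): flux x area x time.
received = kp * conc_mg_cm3 * area_cm2 * duration_h

# 'Acceptable' dose derived from an inhalation OEL (mg/m3) and a
# conventional 10 m3 breathing volume over an 8-h shift.
oel_mg_m3 = 20.0
acceptable = oel_mg_m3 * 10.0

dhr = received / acceptable
print(f"DHR median={np.median(dhr):.2f}, "
      f"95th percentile={np.percentile(dhr, 95):.2f}")
```

A DHR distribution sitting well below 1 suggests the scenario is of low concern; mass above 1 flags it for closer assessment, which matches the screening role described above.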
Abstract:
Purpose - This article describes the use of web services to interconnect the GTBib interlibrary loan program with the OCLC WorldShare platform. Design/methodology/approach - We describe the current problem of duplication of procedures in libraries that have added their collections to the OCLC WorldCat catalogue in recent years and are therefore more likely to receive interlibrary loan requests through the WorldShare Platform. Findings - A solution that uses web services to insert and retrieve requests between the two systems is presented. Autonomous agents periodically check the status of the requests and keep them updated and synchronized. These agents also inform the library staff of any variation or inconsistency that is detected. Practical implications - This technology reduces process management time by making it unnecessary to introduce the request data in both systems. Agents are used to check the consistency of statuses between the two systems, thus avoiding errors and omissions and improving the efficiency of the whole interlibrary loan process. Originality/value - This paper describes in detail the technical aspects of the solution as a reference for the development of future applications.
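The agent behaviour described in the findings amounts to a periodic reconcile-and-propagate loop. The Python sketch below is a hypothetical rendering of one pass of such an agent; the in-memory dictionaries and callback functions stand in for the real GTBib and WorldShare APIs, which are not shown in the article excerpted here.

```python
def sync_statuses(fetch_local, fetch_remote, push_local, push_remote, notify):
    """One pass of the agent: compare request statuses in the two
    systems, propagate changes, and report inconsistencies to staff."""
    local, remote = fetch_local(), fetch_remote()
    for rid in local.keys() | remote.keys():
        if rid not in local:
            push_local(rid, remote[rid])              # new request from WorldShare
        elif rid not in remote:
            notify(f"request {rid} missing remotely")  # inconsistency -> staff
        elif local[rid] != remote[rid]:
            push_remote(rid, local[rid])              # keep both systems aligned

# Hypothetical in-memory stand-ins for the two systems.
gtbib = {"req1": "shipped", "req2": "pending"}
worldshare = {"req1": "pending", "req3": "requested"}
sync_statuses(lambda: gtbib, lambda: worldshare,
              lambda rid, st: gtbib.update({rid: st}),
              lambda rid, st: worldshare.update({rid: st}),
              print)
print(gtbib, worldshare)
```

Running this pass on a schedule is what removes the duplicated manual data entry: each request is keyed once and the agent keeps the two records convergent.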
Abstract:
We present here the first part of the literature review regarding our study object, the Open Device Labs. The research on ODLs emerges from the observation of a worldwide non-profit movement which, through mutual collaboration and the sharing of information and devices, ultimately aims to improve the user's experience of the web and apps.
Abstract:
Browsing the web has become one of the most important features of high-end mobile phones, and in the future more and more people will be using their mobile phones for web browsing. Large touchscreens improve the browsing experience, but many web sites are designed to be used with a mouse. A touchscreen differs substantially from a mouse as a pointing device, and therefore mouse emulation logic is required in browsers to make more web sites usable. This Master's thesis lists the most significant cases where the differences between a mouse and a touchscreen affect web browsing. Five touchscreen mobile phones and their web browsers were evaluated to find out whether and how these cases are handled in them. As part of this thesis, a simple QtWebKit-based mobile web browser with an advanced mouse emulation model was also implemented, aiming to solve all the problematic cases. The conclusion of this work is that it is feasible to emulate a mouse with a touchscreen and thus deliver a good user experience in mobile web browsing. However, current high-end touchscreen mobile phones have relatively underdeveloped mouse emulation in their web browsers, and there is much to improve.
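The core of such emulation logic is classifying a touch sequence and synthesising the mouse events a mouse-oriented page expects. The sketch below is a hypothetical Python rendering of that classification (the thesis implementation was QtWebKit-based, i.e. C++/Qt); the thresholds, event names, and the send callback are illustrative assumptions.

```python
from dataclasses import dataclass

TAP_MAX_S = 0.3        # illustrative thresholds
LONG_PRESS_S = 0.8
MOVE_SLOP_PX = 10

@dataclass
class TouchEvent:
    kind: str          # "down", "move", "up"
    x: int
    y: int
    t: float           # timestamp in seconds

def emulate_mouse(touches, send):
    """Map a completed touch sequence to synthetic mouse events.
    A short tap becomes move+down+up+click, so :hover and onmouseover
    handlers fire before the click; a long press maps to a context
    menu; a drag beyond the slop is treated as panning, not dragging."""
    down, last = touches[0], touches[-1]
    moved = max(abs(last.x - down.x), abs(last.y - down.y)) > MOVE_SLOP_PX
    held = last.t - down.t
    if moved:
        send("pan", last.x, last.y)           # scroll the page, no mouse events
    elif held >= LONG_PRESS_S:
        send("contextmenu", down.x, down.y)   # long press as right click
    elif held <= TAP_MAX_S:
        for kind in ("mousemove", "mousedown", "mouseup", "click"):
            send(kind, down.x, down.y)

emulate_mouse([TouchEvent("down", 100, 200, 0.0), TouchEvent("up", 101, 201, 0.1)],
              lambda kind, x, y: print(kind, x, y))
```

Sending a mousemove before the press is the detail that makes hover-driven menus usable, which is one of the problematic cases a plain tap-to-click mapping misses.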
Abstract:
Print quality and the printability of paper are very important attributes for modern printing applications. In prints containing images, high print quality is a basic requirement. Tone unevenness and non-uniform glossiness of printed products are the most disturbing factors influencing overall print quality. These defects are caused by non-ideal interactions of paper, ink, and printing devices in high-speed printing processes. Since print quality is a perceptual characteristic, measuring unevenness in accordance with human vision is a significant problem. In this thesis, the mottling phenomenon is studied. Mottling is a printing defect characterized by a spotty, non-uniform appearance in solid printed areas. Print mottle is usually the result of uneven ink lay-down or non-uniform ink absorption across the paper surface, and it is especially visible in mid-tone imagery and areas of uniform color, such as solids and continuous-tone screen builds. Using existing knowledge of visual perception and known methods to quantify print tone variation, a new method for evaluating print unevenness is introduced. The method is compared to previous results in the field and is supported by psychometric experiments. Pilot studies estimate how the optical characteristics of the paper before printing affect the unevenness of the printed area after printing. Instrumental methods for print unevenness evaluation have been compared, and the results indicate that the proposed method corresponds better to visual evaluation. The method has been successfully implemented as an industrial application and has proved to be a reliable substitute for visual expertise.
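A common way to quantify tone variation in line with human vision is to measure variability of local mean reflectance at several perceptually relevant scales and weight the bands by a contrast-sensitivity-like curve. The Python sketch below illustrates that general idea only; the scales, weights, and synthetic images are assumptions, not the thesis's calibrated method.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mottle_index(image, scales_px=(2, 4, 8, 16), weights=(0.4, 0.8, 1.0, 0.6)):
    """Weighted sum of coefficients of variation of local mean reflectance.
    Each scale approximates one spatial-frequency band; the weights mimic
    a contrast sensitivity function (values here are illustrative)."""
    image = image.astype(float)
    index = 0.0
    for scale, w in zip(scales_px, weights):
        local_mean = uniform_filter(image, size=scale)
        index += w * local_mean.std() / local_mean.mean()
    return index

# Synthetic example: a uniform print vs. one with blotchy tone variation.
rng = np.random.default_rng(1)
flat = np.full((256, 256), 0.6) + rng.normal(0, 0.01, (256, 256))
blotchy = flat + 0.05 * np.sin(np.outer(np.linspace(0, 8 * np.pi, 256),
                                        np.ones(256)))
print(f"flat: {mottle_index(flat):.4f}, blotchy: {mottle_index(blotchy):.4f}")
```

Averaging before measuring variance is what distinguishes low-frequency mottle, which the eye notices, from fine-grained noise that largely averages out visually.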
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update, and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis, and validation approach that helps the service developer create dependable, stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature, continuously evolving tools. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps capture faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, so that they can be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and verified for basic properties such as deadlock freedom, liveness, reachability, and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
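The shape of a generated skeleton with pre- and post-condition guards can be pictured as follows. This is a hypothetical Python sketch of a stateful booking resource, not the tool's actual output (which is generated from the thesis's own UML models); the state names and methods echo the hotel room booking example.

```python
class BookingError(Exception):
    pass

class RoomBooking:
    """Stateful REST resource sketch: a booking moves through
    created -> confirmed -> checked_in, and each method guards its
    precondition and asserts its postcondition."""

    def __init__(self):
        self.state = "created"

    def confirm(self, payment_ok: bool):
        # Precondition: can only confirm a freshly created booking.
        if self.state != "created":
            raise BookingError(f"cannot confirm from state {self.state!r}")
        if not payment_ok:
            raise BookingError("payment must be authorised first")
        # --- developer fills in the actual business logic here ---
        self.state = "confirmed"
        # Postcondition: the transition actually happened.
        assert self.state == "confirmed"

    def check_in(self):
        if self.state != "confirmed":
            raise BookingError(f"cannot check in from state {self.state!r}")
        self.state = "checked_in"
        assert self.state == "checked_in"

b = RoomBooking()
b.confirm(payment_ok=True)
b.check_in()
print(b.state)   # checked_in
```

The division of responsibilities matches the text: the precondition checks constrain the caller's request sequence, while the postcondition assertions hold the developer to the behaviour the interface promises.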
Abstract:
With small and medium-sized enterprises (SMEs) making up the majority of global businesses, it is important that they act in an environmentally responsible manner. Environmental management systems (EMS) help companies evaluate and improve their environmental impact, but they often require human, financial, and time resources that not all SMEs can provide. This research encompasses interviews with representatives of two small enterprises in Germany to provide insights into their understanding and knowledge of an EMS and how they perceive their responsibility towards the environment. Furthermore, it presents a toolkit created especially for small and medium-sized enterprises that serves as a simplified version of an EMS based on the ISO 14001 standard and is evaluated by the representatives of the SMEs. Among the findings: while being open to the idea of improving their environmental impact, SMEs do not always feel it is their responsibility to do so, and they seem to lack the means to fully implement an EMS. The developed toolkit is considered useful and usable, and recommendations are drawn for its future enhancement.
Abstract:
With the growth of new technologies, using online tools has become part of everyday life. This has a particular impact on researchers, as the data obtained from various experiments needs to be analyzed, and knowledge of programming has become almost mandatory even for pure biologists. Hence, VTT developed a new tool, R Executables (REX), a web application designed to provide a graphical interface to biological data functions such as image analysis, gene expression data analysis, plotting, and disease and control studies, employing R functions to produce the results. REX provides an interactive application in which biologists can directly enter values and run the required analysis with a single click. The program processes the given data in the background and returns results rapidly. Due to the growth of data and the load on the server, the interface developed problems concerning processing time, a poor GUI, data storage, security, a minimally interactive user experience, and crashes with large amounts of data. This thesis describes the methods by which these problems were resolved to make REX a better application for the future. The old REX was developed using Python Django; the new version is implemented with Vaadin, a Java framework for developing web applications whose programming model is essentially Java enriched with new components. Vaadin provides better security, better speed, and a good interactive interface. In this thesis, a subset of REX functionality, comprising IST bulk plotting and image segmentation, was selected and implemented using Vaadin. I wrote 662 lines of code, with Vaadin as the front-end handler and the R language used for back-end data retrieval, computation, and plotting. The application is optimized to allow further functionality to be migrated with ease from the old REX. Future development will focus on including high-throughput screening functions along with gene expression database handling.
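The front-end/back-end split described above (a web UI delegating computation and plotting to R) can be sketched independently of the framework. The snippet below is a hypothetical Python stand-in for the front-end handler, shelling out to Rscript; the script name, argument layout, and output path are placeholders, not REX's actual interface.

```python
import subprocess
import tempfile
from pathlib import Path

def run_r_plot(values, r_script="plot_values.R"):
    """Hand user input to an R script and return the path of the plot
    it writes -- the same front-end/back-end split REX uses, with the
    Vaadin handler replaced by plain Python for illustration."""
    out_png = Path(tempfile.mkdtemp()) / "plot.png"
    args = ["Rscript", r_script, str(out_png)] + [str(v) for v in values]
    result = subprocess.run(args, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"R failed: {result.stderr}")
    return out_png

# A 'Run analysis' button handler would call, e.g.:
# png_path = run_r_plot([1.2, 3.4, 5.6])
```

Keeping R behind a narrow invocation boundary like this is what lets the remaining old-REX functions be migrated one at a time, as the abstract anticipates.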
Abstract:
Highly dynamic systems, often considered resilient systems, are characterised by abiotic and biotic processes under continuous and strong changes in space and time. Because of this variability, the detection of overlapping anthropogenic stress is challenging. Coastal areas harbour dynamic ecosystems in the form of open sandy beaches, which cover the vast majority of the world's ice-free coastline. These ecosystems are currently threatened by increasing human-induced pressure, among which is the mass development of opportunistic macroalgae (mainly composed of Chlorophyta, the so-called green tides) resulting from the eutrophication of coastal waters. The ecological impact of opportunistic macroalgal blooms (green tides, and blooms formed by other opportunistic taxa) has long been evaluated within sheltered and non-tidal ecosystems. Little is known, however, about how more dynamic ecosystems, such as open macrotidal sandy beaches, respond to such stress. This thesis assesses the effects of anthropogenic stress on the structure and functioning of highly dynamic ecosystems, using sandy beaches impacted by green tides as a study case. The thesis is based on four field studies, which analyse natural sandy sediment benthic community dynamics over several temporal (from month to multi-year) and spatial (from local to regional) scales. In this thesis, I report long-lasting responses of sandy beach benthic invertebrate communities to green tides, across thousands of kilometres and over seven years, and highlight more pronounced responses of zoobenthos living in exposed sandy beaches compared to semi-exposed sands. Within exposed sandy sediments, and across a vertical scale (from inshore to nearshore sandy habitats), I also demonstrate that the effect of the presence of algal mats on intertidal benthic invertebrate communities is more pronounced than on subtidal benthic invertebrate assemblages and on flatfish communities. Focussing on small-scale variations in the most affected faunal group (i.e. benthic invertebrates living at low shore), this thesis reveals a decrease in overall beta-diversity along a eutrophication gradient manifested in the form of green tides, as well as the increasing importance of biological variables in explaining ecological variability of sandy beach macrobenthic assemblages along the same gradient. To illustrate the processes associated with the structural shifts observed where green tides occurred, I investigated the effects of high biomasses of opportunistic macroalgae (Ulva spp.) on the trophic structure and functioning of sandy beaches. This work reveals a progressive simplification of sandy beach food web structure and a modification of energy pathways over time, through direct and indirect effects of Ulva mats on several trophic levels. Through this thesis I demonstrate that highly dynamic systems respond differently (e.g. a shift in δ13C, not in δ15N) and more subtly (e.g. no mass mortality in benthos was found) to anthropogenic stress compared to what has previously been shown within more sheltered and non-tidal systems. Obtaining these results would not have been possible without the approach used throughout this work; I thus present a framework coupling field investigations with analytical approaches to describe shifts in highly variable ecosystems under human-induced stress.