994 results for open innovation
Abstract:
Background: Some countries have recently extended smoke-free policies to particular outdoor settings; however, there is controversy regarding whether this is scientifically and ethically justifiable. Objectives: The objective of the present study was to review research on secondhand smoke (SHS) exposure in outdoor settings. Data sources: We conducted different searches in PubMed for the period prior to September 2012. We checked the references of the identified papers, and conducted a similar search in Google Scholar. Study selection: Our search terms included combinations of "secondhand smoke," "environmental tobacco smoke," "passive smoking" OR "tobacco smoke pollution" AND "outdoors" AND "PM" (particulate matter), "PM2.5" (PM with diameter ≤ 2.5 µm), "respirable suspended particles," "particulate matter," "nicotine," "CO" (carbon monoxide), "cotinine," "marker," "biomarker" OR "airborne marker." In total, 18 articles and reports met the inclusion criteria. Results: Almost all studies used PM2.5 concentration as an SHS marker. Mean PM2.5 concentrations reported for outdoor smoking areas when smokers were present ranged from 8.32 to 124 µg/m³ at hospitality venues, and from 4.60 to 17.80 µg/m³ at other locations. Mean PM2.5 concentrations in smoke-free indoor settings near outdoor smoking areas ranged from 4 to 120.51 µg/m³. SHS levels increased when smokers were present, and outdoor and indoor SHS levels were related. Most studies reported a positive association between SHS measures and smoker density, enclosure of outdoor locations, wind conditions, and proximity to smokers. Conclusions: The available evidence indicates high SHS levels at some outdoor smoking areas and at adjacent smoke-free indoor areas. Further research and standardization of methodology are needed to determine whether smoke-free legislation should be extended to outdoor settings.
Abstract:
The game industry is today a very large field of software development, so it is timely to explore the possibilities offered by free tools and libraries. Producing visual entertainment usually requires, in addition to C++ programming skills, 3D modelling and image-editing skills. In addition, sound production is a very large part of achieving a working whole. This thesis covers all of these areas and examines the suitability of open source tools for game development on the win32 platform. The end result is a fully playable, albeit simple, game called CrazyBunny. The first part of the thesis introduces all the tools that make up the required development environment; an essential part of this introduction is a walkthrough of installing and setting up each tool. The work is based on the OGRE framework, which is not a game engine as such. The missing features have been added by using the CEGUI library for building user interfaces and the FMOD library for implementing the sound system. Other tools used are the Code::Blocks IDE, the Blender modelling program, and the Audacity audio editor. The game application is built on a system based on the State design pattern for managing game states. In this model the main menu, the gameplay state, and the end of the game are separated into their own state classes, which makes the application easier to manage. In the main menu the most important part is the implementation of the menus themselves with the CEGUI library. The implementation of the gameplay state explores OGRE's visual features such as the environment, lights, shadows, overlays, and visual effects. In addition, the game's sounds have been implemented with the popular FMOD library, which several large companies in the field use in their commercial products.
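The abstract describes a State-pattern architecture in which the main menu, gameplay, and game-over screens are separate state classes handled by a small manager. The sketch below only illustrates that structure; the thesis itself implements it in C++ on top of OGRE, CEGUI, and FMOD, and every class name here is hypothetical.

```python
# Minimal State-pattern sketch (illustrative only, not the thesis code).

class GameState:
    def enter(self): ...
    def exit(self): ...
    def update(self, dt): ...

class MenuState(GameState):
    def update(self, dt):
        print("showing main menu")                 # CEGUI widgets in the real project

class PlayState(GameState):
    def update(self, dt):
        print("rendering scene, playing sounds")   # OGRE scene + FMOD audio

class GameOverState(GameState):
    def update(self, dt):
        print("showing final score")

class StateManager:
    """Keeps exactly one active state and handles transitions between states."""
    def __init__(self, initial):
        self.current = initial
        self.current.enter()

    def change(self, new_state):
        self.current.exit()
        self.current = new_state
        self.current.enter()

    def update(self, dt):
        self.current.update(dt)

manager = StateManager(MenuState())
manager.update(0.016)           # one main-menu frame
manager.change(PlayState())     # player pressed "start"
manager.update(0.016)
```

Separating each screen into its own state class keeps per-state logic isolated, which is the manageability benefit the abstract refers to.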
Abstract:
The main objective of this dissertation is to create new knowledge on an administrative innovation, its adoption, diffusion and, finally, its effectiveness. In this dissertation the administrative innovation is approached through a widely utilized management philosophy, namely the total quality management (TQM) strategy. TQM operationalizes a self-assessment procedure, which is based on continual improvement principles and on measuring the improvements. This dissertation also captures the theme of change management as it analyzes the adoption and diffusion of the administrative innovation. It identifies innovation characteristics as well as organisational and individual factors explaining the adoption and implementation. As a special feature, this study also explores the effectiveness of the innovation based on objective data. For studying the administrative innovation (the TQM model), a multinational Case Company provides a versatile ground for a deep, longitudinal analysis. The Case Company started the adoption systematically in the mid-1980s in some of its units; today the procedure is in use throughout the entire global company as part of its strategic planning. The empirical story begins from the innovation adoption decision made in the Case Company over 22 years ago. In order to capture the atmosphere and background leading to the adoption decision, key informants from that time were interviewed, since the main target was to clarify the dynamics of how an administrative innovation develops. In addition, archival material was collected and studied; the available memos and data relating to the innovation, its adoption and later its implementation amounted to 20,500 pages of documents. A survey was furthermore conducted at the end of 2006, focusing on questions related to the innovation, organization and leadership characteristics, with a response rate of 54%. For measuring the effectiveness of the innovation implementation, the needed longitudinal objective performance data were collected. These data included the profit-unit-level experience of TQM, the development of the self-assessment scores per profit unit, and performance data per profit unit measured with profitability, productivity and customer satisfaction. The data covered the years 1995-2006. As a result, the prerequisites for the successful adoption of an administrative innovation were defined, such as top management involvement, support of the change agents, and effective tools for implementation and measurement. The factors with the greatest effect on the depth of the implementation were the timing of the adoption and formalization. The results also indicated that the TQM model does have an effect on company performance measured with profitability, productivity and customer satisfaction. Consequently this thesis contributes to the present literature (i) by taking into its scope an administrative innovation and focusing on the whole innovation implementation process, from adoption through diffusion to its consequences, (ii) by grouping the multifaceted factors affecting innovation adoption and diffusion into individual, organizational and environmental factors, with a strong emphasis on the role of the individual change agents, and (iii) by measuring the depth and consistency of the administrative innovation. This deep analysis was possible due to the availability of longitudinal data with triangulation possibilities.
Abstract:
The aim of the present study was to elicit how patients with delusions with religious content conceptualized or experienced their spirituality and religiousness. Sixty-two patients with present or past religious delusions went through semistructured interviews, which were analyzed using the three coding steps described in grounded theory. Three major themes were found in religious delusions: "spiritual identity," "meaning of illness," and "spiritual figures." One higher-order concept was found: "structure of beliefs." We identified dynamics that put these personal beliefs into constant reconstruction through interaction with the world and others (i.e., open dynamics) and, conversely, structural dynamics that created a complete rupture with the surrounding world and others (i.e., closed structural dynamics); these dynamics may coexist. These analyses may help to identify the psychological functions of delusions with religious content and, therefore, to better conceptualize interventions when dealing with them in psychotherapy.
Abstract:
Our consumption of groundwater, in particular as drinking water or for irrigation, has increased considerably over the years. Many problems are thus emerging, ranging from the prospection of new resources to the remediation of polluted aquifers. Independently of the hydrogeological problem considered, the main challenge remains the characterization of the subsurface properties. A stochastic approach is therefore necessary to represent this uncertainty, by considering multiple geological scenarios and generating a large number of geostatistical realizations. We then encounter the main limitation of these approaches, namely the computational cost of simulating complex flow processes for each of these realizations. In the first part of the thesis, this problem is investigated in the context of uncertainty propagation, where an ensemble of realizations is identified as representing the subsurface properties. To propagate this uncertainty to the quantity of interest while limiting the computational cost, current methods rely on approximate flow models. This allows the identification of a subset of realizations representing the variability of the initial ensemble. The complex flow model is then evaluated only for this subset and, based on these complex responses, inference is made. Our objective is to improve the performance of this approach by using all the available information. To this end, the subset of approximate and exact responses is used to construct an error model, which then serves to correct the remaining approximate responses and predict the response of the complex model. This method maximizes the use of the available information without a perceptible increase in computation time, making the uncertainty propagation more accurate and more robust. The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the approximate and complex flow models. In the second part of the thesis, this methodology is formalized mathematically by introducing a regression model between functional responses. As this problem is ill-posed, its dimensionality must be reduced. In this respect, the novelty of the presented work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also makes it possible to diagnose the quality of the error model in this functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid, and the results show that the error model allows a strong reduction of the computation time while correctly estimating the uncertainty. Moreover, for each approximate response, a prediction of the complex response is provided by the error model. The concept of a functional error model is therefore relevant for uncertainty propagation, but also for Bayesian inference problems. Markov chain Monte Carlo (MCMC) methods are the algorithms most commonly used to generate geostatistical realizations in agreement with the observations.
However, these methods suffer from a very low acceptance rate for high-dimensional problems, resulting in a large number of wasted flow simulations. A two-step approach, the two-stage MCMC, has been introduced to avoid unnecessary simulations of the complex model through a preliminary evaluation of the proposal. In the third part of the thesis, the approximate flow model coupled with an error model serves as the preliminary evaluation for the two-stage MCMC. We demonstrate an increase in the acceptance rate by a factor of 1.5 to 3 compared with a classical MCMC implementation. One question remains open: how to choose the size of the training set and how to identify the realizations that optimize the construction of the error model. This requires an iterative strategy so that, with each new flow simulation, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, where the methodology is applied to a saltwater intrusion problem in a coastal aquifer. -- Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization. Due to computational constraints, state-of-the-art methods make use of approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine learning approach. For the subset identified by a classical approach (here the distance kernel method) both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning from a subset of realizations the relationship between proxy and exact curves.
In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty. The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, to perform Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulation of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor of three with respect to one-stage MCMC results. An open question remains: how do we choose the size of the learning set and identify the realizations that optimize the construction of the error model? This requires devising an iterative strategy to construct the error model, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
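The abstract above outlines a functional error model: the exact flow simulator is run only for a training subset of realizations, the mismatch between proxy and exact response curves is learned in a reduced (FPCA) space, and the learned map is then used to correct every proxy curve. The following sketch is a loose illustration of that idea under simplifying assumptions: ordinary PCA on discretised curves stands in for a proper functional PCA, a plain linear map between score spaces stands in for the functional regression, and the data are synthetic. It is not the author's implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def build_functional_error_model(proxy_all, exact_sub, train_idx, n_components=5):
    """Correct cheap proxy curves using a regression between PCA score spaces.

    proxy_all : (n_realizations, n_times) approximate flow responses (all realizations)
    exact_sub : (n_train, n_times) exact flow responses, training subset only
    train_idx : indices of the realizations for which the exact model was run
    """
    pca_proxy = PCA(n_components=n_components).fit(proxy_all)
    pca_exact = PCA(n_components=n_components).fit(exact_sub)

    # Scores of the training proxies and of the corresponding exact curves
    s_proxy_train = pca_proxy.transform(proxy_all[train_idx])
    s_exact_train = pca_exact.transform(exact_sub)

    # Error model: linear map from proxy scores to exact scores
    reg = LinearRegression().fit(s_proxy_train, s_exact_train)

    # Predict "exact-like" curves for every realization from its proxy curve
    s_pred = reg.predict(pca_proxy.transform(proxy_all))
    return pca_exact.inverse_transform(s_pred)

# Tiny synthetic example: 200 proxy breakthrough-like curves, exact model run for 30
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
proxy_all = np.array([np.exp(-t / tau) for tau in rng.uniform(0.1, 1.0, 200)])
exact_sub = proxy_all[:30] ** 1.2 + 0.01 * rng.standard_normal((30, 100))  # biased, noisy "exact"
corrected = build_functional_error_model(proxy_all, exact_sub, np.arange(30))
```

In a two-stage MCMC setting, `corrected` would play the role of the cheap preliminary evaluation of a proposal, so the expensive flow model is only run when the corrected proxy response looks acceptable.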
Abstract:
BACKGROUND: The management of unresectable metastatic colorectal cancer (mCRC) is a comprehensive treatment strategy involving several lines of therapy, maintenance, salvage surgery, and treatment-free intervals. Besides chemotherapy (fluoropyrimidine, oxaliplatin, irinotecan), molecular-targeted agents such as anti-angiogenic agents (bevacizumab, aflibercept, regorafenib) and anti-epidermal growth factor receptor agents (cetuximab, panitumumab) have become available. Ultimately, given the increasing cost of new active compounds, new strategy trials are needed to define the optimal use and the best sequencing of these agents. Such new clinical trials require alternative endpoints that can capture the effect of several treatment lines and be measured earlier than overall survival to help shorten the duration and reduce the size and cost of trials. METHODS/DESIGN: STRATEGIC-1 is an international, open-label, randomized, multicenter phase III trial designed to determine an optimally personalized treatment sequence of the available treatment modalities in patients with unresectable RAS wild-type mCRC. Two standard treatment strategies are compared: first-line FOLFIRI-cetuximab, followed by oxaliplatin-based second-line chemotherapy with bevacizumab (Arm A) vs. first-line OPTIMOX-bevacizumab, followed by irinotecan-based second-line chemotherapy with bevacizumab, and by an anti-epidermal growth factor receptor monoclonal antibody with or without irinotecan as third-line treatment (Arm B). The primary endpoint is duration of disease control. A total of 500 patients will be randomized in a 1:1 ratio to one of the two treatment strategies. DISCUSSION: The STRATEGIC-1 trial is designed to give global information on the therapeutic sequences in patients with unresectable RAS wild-type mCRC, which in turn is likely to have a significant impact on the management of this patient population. The trial has been open for inclusion since August 2013. TRIAL REGISTRATION: STRATEGIC-1 is registered at ClinicalTrials.gov: NCT01910610, 23 July, 2013. STRATEGIC-1 is registered at EudraCT-No.: 2013-001928-19, 25 April, 2013.
Abstract:
Since the first implantation of an endograft in 1991, endovascular aneurysm repair (EVAR) has rapidly gained recognition. Historical trials showed lower early mortality rates, but these results were not maintained beyond 4 years. Despite newer-generation devices, higher rates of reintervention are associated with EVAR during follow-up. Therefore, the best therapeutic decision relies on many parameters that the physician has to take into consideration. Patients' preferences and characteristics are important, especially age and life expectancy besides health status. Aneurysmal anatomical conditions probably remain the most predictive factor and should be carefully evaluated to offer the best treatment. Unfavorable anatomy has been observed to be associated with more complications, especially endoleaks, leading to more reinterventions and a higher risk of late mortality. Nevertheless, technological advances have led surgeons to move beyond the established barriers. Thus, more endografts are implanted outside the instructions for use, despite excellent results after open repair, especially in low-risk patients. When debating abdominal aortic aneurysm (AAA) repair, some other crucial points should be analysed. It has been shown that strict surveillance is mandatory after EVAR to offer durable results and prevent late rupture. Such a program is associated with additional costs and with increased radiation exposure. Moreover, a risk of loss of renal function exists when repetitive imaging and secondary procedures are required. The aim of this article is to review the data associated with AAA and its treatment in order to establish selection criteria to decide between open and endovascular repair.
Abstract:
The movement for open access to science seeks to achieve unrestricted and free access to academic publications on the Internet. To this end, two mechanisms have been established: the gold road, in which scientific journals are openly accessible, and the green road, in which publications are self-archived in repositories. The publication of the Finch Report in 2012, advocating exclusively the adoption of the gold road, generated a debate as to whether either of the two options should be prioritized. The recommendations of the Finch Report stirred controversy among academics specialized in open access issues, who felt that the role played by repositories was not adequately considered and that the gold road places the burden of publishing costs essentially on authors. The Finch Report's conclusions are compatible with the characteristics of science communication in the UK, and they could surely also be applied to the (few) countries with a powerful publishing industry and substantial research funding. In Spain, both the current national legislation and the existing rules at universities largely advocate the green road. This is directly related to the structure of scientific communication in Spain, where many journals have little commercial significance, the system of charging a fee to authors has not been adopted, and there is a good repository infrastructure. As for open access policies, the performance of the scientific communication system in each country should be carefully analyzed to determine the most suitable open access strategy. [Int Microbiol 2013; 16(3):199-203]
Abstract:
This paper contains a joint ESHG/ASHG position document with recommendations regarding responsible innovation in prenatal screening with non-invasive prenatal testing (NIPT). By virtue of its greater accuracy and safety with respect to prenatal screening for common autosomal aneuploidies, NIPT has the potential to help the practice better achieve its aim of facilitating autonomous reproductive choices, provided that balanced pretest information and non-directive counseling are available as part of the screening offer. Depending on the health-care setting, different scenarios for NIPT-based screening for common autosomal aneuploidies are possible. The trade-offs involved in these scenarios should be assessed in light of the aim of screening, the balance of benefits and burdens for pregnant women and their partners, and considerations of cost-effectiveness and justice. With improving screening technologies and decreasing costs of sequencing and analysis, it will become possible in the near future to significantly expand the scope of prenatal screening beyond common autosomal aneuploidies. Commercial providers have already begun expanding their tests to include sex-chromosomal abnormalities and microdeletions. However, multiple false positives may undermine the main achievement of NIPT in the context of prenatal screening: the significant reduction of the invasive testing rate. This document argues for a cautious expansion of the scope of prenatal screening to serious congenital and childhood disorders, only following sound validation studies and a comprehensive evaluation of all relevant aspects. A further core message of this document is that in countries where prenatal screening is offered as a public health programme, governments and public health authorities should adopt an active role to ensure the responsible innovation of prenatal screening on the basis of ethical principles. Crucial elements are the quality of the screening process as a whole (including non-laboratory aspects such as information and counseling), education of professionals, systematic evaluation of all aspects of prenatal screening, development of better evaluation tools in the light of the aim of the practice, accountability to all stakeholders (including children born from screened pregnancies and persons living with the conditions targeted in prenatal screening), and promotion of equity of access.
Abstract:
Background: Information about the composition of regulatory regions is of great value for designing experiments to functionally characterize gene expression. The multiplicity of available applications to predict transcription factor binding sites in a particular locus contrasts with the substantial computational expertise required to operate them, which may constitute a potential barrier for the experimental community. Results: CBS (Conserved regulatory Binding Sites, http://compfly.bio.ub.es/CBS) is a public platform of evolutionarily conserved binding sites and enhancers predicted in multiple Drosophila genomes, furnished with published chromatin signatures associated with transcriptionally active regions and other experimental sources of information. Rapid access to this novel body of knowledge through a user-friendly web interface enables non-expert users to identify the binding sequences available for any particular gene, transcription factor, or genome region. Conclusions: The CBS platform is a powerful resource that provides tools for mining individual sequences and groups of co-expressed genes together with epigenomic information to conduct regulatory screenings in Drosophila.
Abstract:
BACKGROUND: The primary analysis of the FLAMINGO study at 48 weeks showed that patients taking dolutegravir once daily had a significantly higher virological response rate than did those taking ritonavir-boosted darunavir once daily, with similar tolerability. We present secondary efficacy and safety results analysed at 96 weeks. METHODS: FLAMINGO was a multicentre, open-label, phase 3b, non-inferiority study of HIV-1-infected treatment-naive adults. Patients were randomly assigned (1:1) to dolutegravir 50 mg or darunavir 800 mg plus ritonavir 100 mg, with investigator-selected combination tenofovir and emtricitabine or combination abacavir and lamivudine background treatment. The main endpoints were plasma HIV-1 RNA less than 50 copies per mL and safety. The non-inferiority margin was -12%. If the lower end of the 95% CI was greater than 0%, then we concluded that dolutegravir was superior to ritonavir-boosted darunavir. This trial is registered with ClinicalTrials.gov, number NCT01449929. FINDINGS: Of 595 patients screened, 488 were randomly assigned and 484 included in the analysis (242 assigned to receive dolutegravir and 242 assigned to receive ritonavir-boosted darunavir). At 96 weeks, 194 (80%) of 242 patients in the dolutegravir group and 164 (68%) of 242 in the ritonavir-boosted darunavir group had HIV-1 RNA less than 50 copies per mL (adjusted difference 12·4, 95% CI 4·7-20·2; p=0·002), with the greatest difference in patients with high viral load at baseline (50/61 [82%] vs 32/61 [52%], homogeneity test p=0·014). Six participants (three since 48 weeks) in the dolutegravir group and 13 (four) in the darunavir plus ritonavir group discontinued because of adverse events. The most common drug-related adverse events were diarrhoea (23/242 [10%] in the dolutegravir group vs 57/242 [24%] in the darunavir plus ritonavir group), nausea (31/242 [13%] vs 34/242 [14%]), and headache (17/242 [7%] vs 12/242 [5%]). INTERPRETATION: Once-daily dolutegravir is associated with a higher virological response rate than is once-daily ritonavir-boosted darunavir. Dolutegravir compares favourably in efficacy and safety to a boosted darunavir regimen with nucleoside reverse transcriptase inhibitor background treatment for HIV-1-infected treatment-naive patients. FUNDING: ViiV Healthcare and Shionogi & Co.
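The superiority conclusion rests on the confidence interval for the difference in week-96 response rates: the non-inferiority margin was -12%, and because the lower bound of the 95% CI (4.7%) also lies above 0%, superiority was concluded. The snippet below reproduces the order of magnitude of that interval with a plain, unadjusted Wald approximation (the published 4.7-20.2 figure is an adjusted difference), purely to illustrate the decision rule.

```python
import math

# Week-96 responders (HIV-1 RNA < 50 copies/mL) per arm, taken from the abstract.
n = 242
responders_dtg, responders_drv = 194, 164

p1, p2 = responders_dtg / n, responders_drv / n
diff = p1 - p2                                          # ~12.4 percentage points
se = math.sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)   # Wald standard error
lower, upper = diff - 1.96 * se, diff + 1.96 * se

print(f"difference {diff:.1%}, unadjusted 95% CI {lower:.1%} to {upper:.1%}")
# Decision rule from the trial design:
#   lower bound > -12%  -> dolutegravir is non-inferior
#   lower bound >  0%   -> dolutegravir is declared superior
```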
Abstract:
Innovation is the word of this decade. According to common definitions of innovation, a company's product or service has not been an innovation unless it has had a positive impact on sales and gained a meaningful market share. The research problem of this master's thesis is to find out what the innovation process of complex new consumer products and services looks like in the new innovation paradigm. The objective is to answer two research questions: 1) What are the critical success factors a company should address when implementing the paradigm change in the mass-market consumer business with complex products and services? 2) What process or framework could a firm follow? The research problem is examined from the points of view of one company's innovation creation process, networking, and organizational change management, with a special focus on an existing company entering a new business area. An innovation process management framework for complex new consumer products and services in the new innovation paradigm has been created with the support of several existing innovation theories. The new process framework includes the critical innovation process elements companies should take into consideration in their daily activities when implementing innovation in a new business. The case company's location-based business implementation activities are studied through the new innovation process framework. This case study showed how important it is to manage the process, to observe how the target market and its competition develop during the company's own innovation process, to make decisions at the right time, and to plan and implement organizational change management from the beginning as one activity in the innovation process. Finally, this master's thesis showed that every company needs to create its own innovation process master plan with milestones and activities. One plan does not fit all, but every company can start its planning from the new innovation process introduced in this thesis.