987 results for common pool resource
Abstract:
The β site APP cleaving enzyme 1 (BACE1) is the rate-limiting β-secretase enzyme in the amyloidogenic processing of APP and Aβ formation, and therefore it has a prominent role in Alzheimer's disease (AD) pathology. Recent evidence suggests that the prion protein (PrP) interacts directly with BACE1, regulating its β-secretase activity. Moreover, PrP has been proposed as the cellular receptor involved in the impairment of synaptic plasticity and the toxicity caused by Aβ oligomers. Given that common pathophysiologic mechanisms are shared by Alzheimer's disease and Creutzfeldt-Jakob disease (CJD), we investigated, for the first time to the best of our knowledge, a possible association of a common synonymous BACE1 polymorphism (rs638405) with sporadic CJD (sCJD). Our results indicate that the BACE1 C-allele is associated with an increased risk of developing sCJD, mainly in PRNP M129M homozygous subjects with early onset. These results extend the very short list of genes (other than PRNP) involved in the development of human prion diseases, and support the notion that, as in AD, several loci may contribute with modest overall effects to sCJD risk. These findings underscore the interplay in both pathologies of APP, Aβ oligomers, ApoE, PrP and BACE1, and suggest that aging and perhaps vascular risk factors may modulate disease pathologies in part through these key players.
Abstract:
The purpose of this study was to examine the enterprise resource planning (ERP) systems of a company's different units and to compare their success factors against Adelakun's model of information-system quality dimensions. In that model, the overall quality of a system is divided into business quality, technical quality, and user-perceived quality. Based on the results, a further goal was to build a model of the interdependence of the success factors to support further development of the ERP system in question. Data on the systems and their use were collected from deployment-project documentation, interviews, questionnaires, and system analyses. Both end users and company management were target groups of the surveys and interviews. The collected data were evaluated according to Adelakun's three-dimensional model of information-system quality factors, and the key success factors were identified. In the cases studied, the factors behind the success of the information systems were consistent with those reported in the literature. Adelakun's quality-dimension model also proved valid in the studied cases. A model of the interrelationships among the key success factors was constructed, which can be used in the further development of the system.
Abstract:
This thesis investigates the risk of losing the intellectual capital concentrated in the key personnel of small and medium-sized enterprises, and seeks ways to manage that risk. In SMEs, business-critical knowledge is often concentrated in a few individuals. Losing the work contribution and expertise of such key persons entirely may be fatal to the company's operations; even a temporary loss of expertise can hamper them. The risk of losing intellectual capital can be reduced either by transferring the key person's knowledge to other people or by ensuring that the key person's employment continues for as long as possible. The appropriate knowledge-transfer methods depend on the nature of the knowledge: whether it is tacit or explicit. Key-person retention is influenced by the company's personnel policy, under which employees are compensated for their work and commit to the company. Looking after an employee's working capacity also affects whether the knowledge remains available to the company. The interviews at the example company reveal clear points of contact with the knowledge-management problems discussed in the other parts of the study.
Abstract:
The aim of this study was to identify the forms of business relationship used in purchasing MRO products, and the issues to consider when moving toward collaboration with a supplier. The study was carried out as a qualitative case study in which data collection was based on interviews, internal documentation, and participant observation. The analysis was based on a theoretical product classification and on examining the classification groups in practice. The central finding is the increasing use of collaborative relationships in the purchasing of standard products. This stems from the aim of carrying out the purchasing of such products with as few resources as possible, so that purchasing attention can be focused on more critical products. The most common business relationships used for MRO products are competitive bidding and annual and framework agreements. Maintenance contracts and partnerships become feasible when the strategic importance of the products grows significant and a high level of trust exists between the parties.
Abstract:
Monte Carlo simulations were used to generate data for ABAB designs of different lengths. The points of change in phase are randomly determined before gathering behaviour measurements, which allows the use of a randomization test as an analytic technique. Data simulation and analysis can be based either on data-division-specific or on common distributions. Following one method or another affects the results obtained after the randomization test has been applied. Therefore, the goal of the study was to examine these effects in more detail. The discrepancies in these approaches are obvious when data with zero treatment effect are considered and such approaches have implications for statistical power studies. Data-division-specific distributions provide more detailed information about the performance of the statistical technique.
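The randomization test described above can be sketched in a few lines: when the phase-change points of an ABAB design are chosen at random, the test enumerates every admissible assignment of change points and asks how extreme the observed statistic is among them. This is a minimal illustrative sketch, not the study's code; the statistic (mean of B phases minus mean of A phases) and the minimum phase length are assumptions for the example.

```python
import statistics

def abab_stat(data, c1, c2, c3):
    """Mean of the B phases minus mean of the A phases for an ABAB
    design with phase-change points c1 < c2 < c3."""
    a = list(data[:c1]) + list(data[c2:c3])   # A1 and A2 phases
    b = list(data[c1:c2]) + list(data[c3:])   # B1 and B2 phases
    return statistics.mean(b) - statistics.mean(a)

def randomization_test(data, observed_changes, min_len=3):
    """Exact randomization test: enumerate every admissible triplet of
    change points (each phase at least min_len observations long) and
    report the proportion of assignments whose statistic is at least
    as large as the observed one."""
    n = len(data)
    obs = abab_stat(data, *observed_changes)
    stats = [abab_stat(data, c1, c2, c3)
             for c1 in range(min_len, n - 3 * min_len + 1)
             for c2 in range(c1 + min_len, n - 2 * min_len + 1)
             for c3 in range(c2 + min_len, n - min_len + 1)]
    p = sum(s >= obs for s in stats) / len(stats)
    return obs, p

# A series with a strong treatment effect confined to the B phases:
data = [0] * 5 + [10] * 5 + [0] * 5 + [10] * 5
obs, p = randomization_test(data, (5, 10, 15))
```

Simulating many such series under a zero treatment effect and recording how often p falls below the nominal level is exactly the kind of Monte Carlo power study the abstract refers to.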
Abstract:
Accurate prediction of mortality following burns is useful as an audit tool and for guiding treatment planning and resource allocation. Common burn formulae (Ryan Score, Abbreviated Burn Severity Index (ABSI), classic and revised Baux) have not been compared with the standard Acute Physiology and Chronic Health Evaluation II (APACHE II) or re-validated in a severely (≥20% total burn surface area) burned population. Furthermore, the revised Baux (R-Baux) has been externally validated thoroughly only once, and the pediatric Baux (P-Baux) has yet to be. Using 522 severely burned patients, we show that the burn formulae (ABSI, Baux, revised Baux) outperform APACHE II among adults (AUROC increase p<0.001 in adults; p>0.5 in children). The Ryan Score performs well, especially among the most at-risk populations (estimated mortality [90% CI], original versus current study: 33% [26-41%] versus 30.18% [24.25-36.86%] for Ryan Score 2; 87% [78-93%] versus 66.48% [51.31-78.87%] for Ryan Score 3). The R-Baux shows accurate discrimination (AUROC 0.908 [0.869-0.947]) and is well calibrated. However, the ABSI and P-Baux, although showing high discrimination in children (AUROC 0.826 [0.737-0.916] and 0.848 [0.758-0.938], respectively), exceedingly overestimate mortality, indicating poor calibration. We highlight challenges in designing and employing scores that are applicable to a wide range of populations.
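For readers unfamiliar with the scores being compared, the two Baux variants are simple additive formulae. The sketch below uses the commonly cited form of the revised Baux score (age + %TBSA, plus approximately 17 points for inhalation injury); it is an illustration of the score arithmetic only, not the study's code, and converting a score into a mortality probability requires the published logistic calibration, which is not reproduced here.

```python
def classic_baux(age, tbsa):
    """Classic Baux score: patient age plus percent total body
    surface area (TBSA) burned."""
    return age + tbsa

def revised_baux(age, tbsa, inhalation):
    """Revised Baux score: adds roughly 17 points when an
    inhalation injury is present (commonly cited rounding of the
    original coefficient)."""
    return age + tbsa + (17 if inhalation else 0)

# A 50-year-old with 30% TBSA burns and inhalation injury:
score = revised_baux(age=50, tbsa=30, inhalation=True)   # 97
```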
Abstract:
Our consumption of groundwater, in particular as drinking water and for irrigation, has increased considerably over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we face many problems ranging from water prospection to sustainable management and the remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of geostatistical realizations. The main limitation of these approaches is the computational cost of performing complex flow simulations for each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization.
Due to computational constraints, state-of-the-art methods make use of approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run only for this subset, and inference is based on those responses. Our objective is to increase the performance of this approach by using all of the available information rather than solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine-learning approach. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses, predicting the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational cost, and increases the accuracy and robustness of the uncertainty propagation.
The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between proxy and exact curves. In the second part of the thesis, this strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty.
The individual correction of each proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only for uncertainty propagation but also, and perhaps even more so, for Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, in which the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy coupled with an error model provides the preliminary evaluation for the two-stage MCMC setup. We demonstrate an increase in the acceptance rate by a factor of 1.5 to 3 with respect to classical one-stage MCMC. An open question remains: how to choose the size of the learning set, and how to identify the realizations that optimize the construction of the error model. This requires an iterative strategy, such that as each new flow simulation is performed, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, in which the methodology is applied to a saltwater intrusion problem in a coastal aquifer.
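The core idea of the functional error model can be illustrated on synthetic data: run the exact model only on a small training subset, project the proxy curves onto their leading principal components (plain PCA on discretized curves stands in here for FPCA), and regress the exact curves on the proxy scores. Everything below is a toy sketch under assumed synthetic models, not the thesis's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble: n realizations of a breakthrough-like curve on grid t.
n, m = 200, 50
t = np.linspace(0.0, 1.0, m)
scale = rng.uniform(0.5, 1.5, n)                 # hidden "subsurface" property
exact = scale[:, None] * np.exp(-((t - 0.5) ** 2) / 0.02)
proxy = 0.8 * exact + 0.05 * np.sin(2 * np.pi * t)   # biased cheap approximation

# Training subset: the few realizations for which the exact model was run.
train = np.arange(0, n, 10)                      # 20 of 200 realizations

# "FPCA" of the proxy ensemble: PCA of the centered discretized curves.
mean_p = proxy.mean(axis=0)
U, S, Vt = np.linalg.svd(proxy - mean_p, full_matrices=False)
k = 3                                            # retained components
scores_p = (proxy - mean_p) @ Vt[:k].T           # proxy scores, all realizations

# Error model: linear regression from proxy scores to exact curves,
# fitted on the training subset only.
X = np.hstack([scores_p[train], np.ones((len(train), 1))])
B, *_ = np.linalg.lstsq(X, exact[train], rcond=None)

# Correct every proxy response, including the untrained ones.
X_all = np.hstack([scores_p, np.ones((n, 1))])
corrected = X_all @ B

err_proxy = np.mean((proxy - exact) ** 2)        # error before correction
err_corr = np.mean((corrected - exact) ** 2)     # error after correction
```

In this linear toy case the correction is nearly exact; in the thesis the same construction is carried out properly in function space, where the FPCA scores also serve to diagnose the quality of the error model.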
Abstract:
Despite the variety of available Web services registries especially aimed at the Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones adhere more closely to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. Nevertheless, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web service descriptions along with the traditional WSDL-based ones. The registry provides a Web-based interface for Web service registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or via SPARQL (SPARQL Protocol and RDF Query Language). The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license.
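A registry exposing RDF descriptions over a SPARQL endpoint can be queried with a plain SPARQL SELECT. The sketch below builds such a request using the W3C WSDL 2.0 RDF mapping vocabulary; the endpoint URL is a placeholder and the exact graph layout used by BioSWR is an assumption, so treat this as a generic illustration of SPARQL-over-HTTP rather than the registry's documented API.

```python
from urllib.parse import urlencode

# Placeholder endpoint -- not the real BioSWR SPARQL service URL.
ENDPOINT = "http://example.org/sparql"

# List services and their endpoints, assuming descriptions follow the
# W3C "WSDL 2.0: RDF Mapping" vocabulary.
QUERY = """
PREFIX wsdl: <http://www.w3.org/ns/wsdl-rdf#>
SELECT ?service ?endpoint
WHERE {
  ?service a wsdl:Service ;
           wsdl:endpoint ?endpoint .
}
LIMIT 10
"""

def sparql_get_url(endpoint, query):
    """Build the GET URL for a SPARQL SELECT as defined by the
    SPARQL 1.1 Protocol (query passed as the 'query' parameter)."""
    return endpoint + "?" + urlencode({"query": query})

url = sparql_get_url(ENDPOINT, QUERY)
# The URL can then be fetched with any HTTP client and the JSON or
# XML result bindings parsed.
```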
Abstract:
OBJECTIVE: The natural course of chronic hepatitis C varies widely. To improve the profiling of patients at risk of developing advanced liver disease, we assessed the relative contribution of factors for liver fibrosis progression in hepatitis C. DESIGN: We analysed 1461 patients with chronic hepatitis C with an estimated date of infection and at least one liver biopsy. Risk factors for accelerated fibrosis progression rate (FPR), defined as ≥0.13 Metavir fibrosis units per year, were identified by logistic regression. Examined factors included age at infection, sex, route of infection, HCV genotype, body mass index (BMI), significant alcohol drinking (≥20 g/day for ≥5 years), HIV coinfection and diabetes. In a subgroup of 575 patients, we assessed the impact of single nucleotide polymorphisms previously associated with fibrosis progression in genome-wide association studies. Results were expressed as attributable fraction (AF) of risk for accelerated FPR. RESULTS: Age at infection (AF 28.7%), sex (AF 8.2%), route of infection (AF 16.5%) and HCV genotype (AF 7.9%) contributed to accelerated FPR in the Swiss Hepatitis C Cohort Study, whereas significant alcohol drinking, anti-HIV, diabetes and BMI did not. In genotyped patients, variants at rs9380516 (TULP1), rs738409 (PNPLA3), rs4374383 (MERTK) (AF 19.2%) and rs910049 (major histocompatibility complex region) significantly added to the risk of accelerated FPR. Results were replicated in three additional independent cohorts, and a meta-analysis confirmed the role of age at infection, sex, route of infection, HCV genotype, rs738409, rs4374383 and rs910049 in accelerating FPR. CONCLUSIONS: Most factors accelerating liver fibrosis progression in chronic hepatitis C are unmodifiable.
Abstract:
Macroinvertebrates associated with reed beds (Phragmites australis) in six shallow natural water bodies along the 220 km coast of the Comunidad Valenciana (Spain) were studied. These sites were selected to reflect different trophic states but, owing to the natural variability of Mediterranean wetlands, they also differ greatly in salinity and hydroperiod. To unify the sampling, the reed bed was chosen because it provides data from a habitat common to all the wetlands, including the most eutrophic ones, where submerged macrophytes have disappeared due to water turbidity. Individual submerged stems of Phragmites australis were sampled along with the surrounding water, and the animal densities found refer to the stem surface area available for colonization. Forty-one taxa were recorded in total, with Chironomidae being the most important group both quantitatively and qualitatively. In freshwater sites, an increase in macroinvertebrate density was observed at higher trophic states. Nevertheless, each studied region had a different fauna. A PCA of the macroinvertebrate groups distinguished three types of environment: fresh waters (characterized by swimming insect larvae, collectors and predators, oligochaetes and Orthocladiinae), saline waters (characterized by crustaceans and Chironominae), and the spring pool, which shares taxa from both. Special attention was paid to chironomids as the most abundant group. A DCA based on the relative abundance of chironomids reveals salinity as the main factor responsible for their distribution, although trophic state and hydrological regime were also shown to be important.
Abstract:
We present the CLARIN project, whose aim is to promote the use of technological instruments in research in the Humanities and Social Sciences.
Abstract:
Internationalization and the ensuing rapid growth have created the need to consolidate the IT systems of many small and medium-sized production companies. Enterprise Resource Planning (ERP) systems are a common solution for such companies. Deployment of these ERP systems consists of many steps, one of which is the implementation of the same shared system at all international subsidiaries. From the IT point of view, this is also one of the most important steps in the company's internationalization strategy. The mechanical process of creating the required connections for the offshore sites is the easiest and best-documented step along the way, but the actual value of the system, once operational, lies in its operational reliability. The operational reliability of an ERP system is a combination of many factors, ranging from hardware- and connectivity-related issues to administrative tasks and communication between decentralized administrative units and sites. To analyze the operational reliability of such a system accurately, one must consider its full functionality, including not only the mechanical and systematic processes but also the users and their administration. Operational reliability in an international environment relies heavily on adequate hardware and telecommunications, so it is imperative that resources be dimensioned with regard to planned usage. Still, with poorly maintained communication and administration schemes, no amount of bandwidth or memory will be enough to maintain a productive level of reliability. This thesis analyzes the implementation of a shared ERP system at an international subsidiary of a Finnish production company. The system is Microsoft Dynamics AX, currently being introduced at a Slovakian facility, a subsidiary of Peikko Finland Oy. The primary task is to create a feasible basis of analysis against which the operational reliability of the system can be evaluated precisely.
With a solid analysis in hand, the aim is to give recommendations on how future implementations should be managed.