824 results for decentralised data fusion framework
Abstract:
Background: There is no evidence to date on whether transcriptional regulators are able to shift the balance between mitochondrial fusion and fission events through selective control of gene expression. Methodology/Principal Findings: Here, we demonstrate that the reduced mitochondrial size observed in knock-out mice for the transcriptional regulator PGC-1β is associated with a selective reduction in Mitofusin 2 (Mfn2) expression, a mitochondrial fusion protein. This decrease in Mfn2 is specific, since expression of the remaining components of the mitochondrial fusion and fission machinery was not affected. Furthermore, PGC-1β increases mitochondrial fusion and elongates mitochondrial tubules. This PGC-1β-induced elongation specifically requires Mfn2, as this process is absent in Mfn2-ablated cells. Finally, we show that PGC-1β increases Mfn2 promoter activity and transcription by coactivating the nuclear receptor Estrogen Related Receptor α (ERRα). Conclusions/Significance: Taken together, our data reveal a novel mechanism by which mammalian cells control mitochondrial fusion. In addition, we describe a novel role of PGC-1β in mitochondrial physiology, namely the control of mitochondrial fusion mainly through Mfn2.
Abstract:
Most local agencies in Iowa currently make their pavement treatment decisions based on limited experience, due primarily to the lack of a systematic decision-making framework and a decision-aid tool. A lack of objective condition assessment data for agency pavements also contributes to this problem. This study developed a systematic pavement treatment selection framework for local agencies to assist them in selecting the most appropriate treatment and to help justify their maintenance and rehabilitation decisions. The framework is based on an extensive literature review of the various pavement treatment techniques in terms of their technical applicability and limitations, meaningful practices of neighboring states, and the results of a survey of local agencies. The treatment selection framework involves three steps: pavement condition assessment, selection of technically feasible treatments using decision trees, and selection of the most appropriate treatment considering return on investment (ROI) and other non-economic factors. An Excel-based spreadsheet tool that automates the treatment selection framework was also developed, along with a standalone user guide for the tool. The Pavement Treatment Selection Tool (PTST) for Local Agencies allows users to enter the severity and extent levels of existing distresses and then recommends a set of technically feasible treatments. The tool also evaluates the ROI of each feasible treatment and, if necessary, can evaluate the non-economic value of each treatment option to help determine the most appropriate treatment for the pavement. It is expected that the framework and tool will help local agencies significantly improve their pavement asset management practices and make more economical and defensible decisions on pavement treatment selection.
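As an illustration only (not the PTST spreadsheet logic), the sketch below mimics the three-step selection idea: screen technically feasible treatments from condition inputs with simple decision rules, then rank them by an ROI proxy. The distress thresholds, treatment names, costs, and life extensions are hypothetical placeholders.

```python
# Minimal sketch of a decision-tree-style treatment screen followed by an
# ROI ranking, loosely mirroring the PTST workflow described above.
# Thresholds, treatments, costs, and life extensions are hypothetical.

def feasible_treatments(crack_severity: str, rut_depth_mm: float) -> list[str]:
    """Step 2: screen technically feasible treatments from condition data."""
    options = []
    if crack_severity == "low" and rut_depth_mm < 6:
        options += ["crack seal", "fog seal"]
    if crack_severity in ("low", "medium") and rut_depth_mm < 12:
        options += ["chip seal", "thin overlay"]
    if crack_severity == "high" or rut_depth_mm >= 12:
        options += ["mill and overlay", "full-depth reclamation"]
    return options

def rank_by_roi(options: list[str]) -> list[tuple[str, float]]:
    """Step 3: rank feasible treatments by a simple ROI proxy
    (years of service life gained per dollar per square yard)."""
    cost_and_life = {  # hypothetical unit cost ($/sq yd), life extension (years)
        "crack seal": (1.0, 3), "fog seal": (1.5, 3),
        "chip seal": (4.0, 6), "thin overlay": (12.0, 8),
        "mill and overlay": (25.0, 12), "full-depth reclamation": (40.0, 20),
    }
    scored = [(t, cost_and_life[t][1] / cost_and_life[t][0]) for t in options]
    return sorted(scored, key=lambda x: x[1], reverse=True)

if __name__ == "__main__":
    options = feasible_treatments("medium", rut_depth_mm=8)
    print(rank_by_roi(options))  # chip seal ranks first on this toy ROI proxy
```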
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows hypercube generation to be done easily within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework with different data storage model configurations (i.e., row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular available solutions (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated workshops on astronomical data analysis techniques.
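The hypercube generation described above maps naturally onto the map/reduce pattern. Below is a minimal sketch in plain Python rather than the actual Gaia framework or a Hadoop job: the map step derives a bin key from hypothetical catalogue columns (g_mag, parallax_mas), and the reduce step counts records per cell.

```python
# Minimal sketch of the map/reduce idea behind multidimensional histogram
# (hypercube) generation: the map step assigns each record to a bin key,
# the reduce step sums counts per key. Column names and bin widths are
# illustrative only; this is not the Gaia framework itself.
from collections import Counter
from typing import Iterable

def map_to_bin(star: dict, mag_bin: float = 0.5, parallax_bin: float = 1.0) -> tuple:
    """Map step: emit the hypercube cell (bin coordinates) for one record."""
    return (int(star["g_mag"] // mag_bin), int(star["parallax_mas"] // parallax_bin))

def reduce_counts(keys: Iterable[tuple]) -> Counter:
    """Reduce step: aggregate the number of records falling in each cell."""
    return Counter(keys)

if __name__ == "__main__":
    catalogue = [
        {"g_mag": 12.3, "parallax_mas": 2.1},
        {"g_mag": 12.4, "parallax_mas": 2.9},
        {"g_mag": 15.0, "parallax_mas": 0.4},
    ]
    hypercube = reduce_counts(map(map_to_bin, catalogue))
    print(hypercube)  # Counter({(24, 2): 2, (30, 0): 1})
```

In a real MapReduce deployment the two functions would run as the mapper and reducer over partitioned catalogue data; the point of the sketch is only that the bin coordinates act as the shuffle key.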
Abstract:
There are a number of morphological analysers for Polish. Most of these, however, are non-free resources. What is more, different analysers employ different tagsets and tokenisation strategies. This situation calls for a simple and universal framework to join different sources of morphological information, including the existing resources as well as user-provided dictionaries. We present such a configurable framework that allows users to write simple configuration files that define tokenisation strategies and the behaviour of morphological analysers, including simple tagset conversion.
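As a rough illustration of the configuration-driven approach described above (not the actual framework), the sketch below shows how a configuration might declare a tokenisation strategy, a chain of analysers, and a simple tagset-conversion map; all configuration keys, tags, and the toy user dictionary are hypothetical.

```python
# Minimal sketch of a configuration-driven morphological analysis pipeline:
# the config declares the tokenisation strategy, the analysers to consult,
# and a source-to-target tagset map. Everything here is a made-up stand-in.
import re

CONFIG = {
    "tokeniser": r"\w+|[^\w\s]",                     # regex tokenisation strategy
    "analysers": ["user_dictionary"],                # sources of morphological info
    "tagset_map": {"SUBST": "NOUN", "ADJ": "ADJ"},   # source tag -> target tag
}

USER_DICTIONARY = {"kot": ("kot", "SUBST"), "czarny": ("czarny", "ADJ")}

def analyse(text: str, config: dict) -> list[tuple[str, str, str]]:
    """Tokenise, look each token up, and convert tags to the target tagset."""
    tokens = re.findall(config["tokeniser"], text.lower())
    out = []
    for tok in tokens:
        lemma, tag = USER_DICTIONARY.get(tok, (tok, "UNKNOWN"))
        out.append((tok, lemma, config["tagset_map"].get(tag, tag)))
    return out

print(analyse("Czarny kot", CONFIG))
# [('czarny', 'czarny', 'ADJ'), ('kot', 'kot', 'NOUN')]
```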
Abstract:
BACKGROUND: Selective publication of studies, which is commonly called publication bias, is widely recognized. Over the years a new nomenclature for other types of bias related to non-publication or distortion related to the dissemination of research findings has been developed. However, several of these different biases are often still summarized by the term 'publication bias'. METHODS/DESIGN: As part of the OPEN Project (To Overcome failure to Publish nEgative fiNdings) we will conduct a systematic review with the following objectives: to systematically review highly cited articles that focus on non-publication of studies and to present the various definitions of biases related to the dissemination of research findings contained in the articles identified; and to develop and discuss a new framework on the nomenclature of various aspects of distortion in the dissemination process that leads to public availability of research findings, in an international group of experts in the context of the OPEN Project. We will systematically search Web of Knowledge for highly cited articles that provide a definition of biases related to the dissemination of research findings. A specifically designed data extraction form will be developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. For the development of a new framework we will construct an initial table listing different levels and different hazards en route to making research findings public. An international group of experts will iteratively review the table and reflect on its content until no new insights emerge and consensus has been reached. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review, together with the results of other systematic reviews of the OPEN project, will serve as a basis for the development of future policies and guidelines regarding the assessment and prevention of publication bias.
Abstract:
Fenix is a large production management system for the forest industry. The reporting and printing services of the Fenix system are in a period of transition. The previously used reporting tools are becoming obsolete and must be replaced with new ones. The new reporting platform, Global Printing System (GPS), is built around the StreamServe Business Communication Platform. The new platform is intended to handle both the printing and the reporting tasks of Fenix. This thesis describes the implementation of the reporting platform and its most important features. The performance of the new platform has left room for improvement. In particular, generating large reports has sometimes taken hopelessly long. The thesis analyses the performance of the reporting platform and identifies possible bottlenecks. Solutions to the performance weaknesses are sought and suggestions for improving performance are given. As an XML-based system, the efficiency of XML processing plays a major role in the performance of GPS. The data coming into GPS arrives in XML format, and the efficiency of parsing this input is a key factor in the overall efficiency of GPS. The performance improvements therefore focus strongly on more efficient use of XML, and suggestions for improving it are presented.
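Since input parsing efficiency is identified above as a key factor in overall GPS performance, the sketch below illustrates one generic XML optimisation of the kind that might be considered: streaming parsing with xml.etree's iterparse instead of building the full document tree, so memory use stays flat for large report inputs. The element names are illustrative and unrelated to the actual GPS/StreamServe message format.

```python
# Minimal sketch of streaming XML processing: handle elements as they are
# parsed and discard them, rather than loading the whole document tree.
import xml.etree.ElementTree as ET

def count_rows_streaming(path: str, row_tag: str = "row") -> int:
    """Process a large XML report input element by element."""
    count = 0
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == row_tag:
            count += 1
            elem.clear()  # free the processed subtree to bound memory use
    return count

# Usage (hypothetical file): count_rows_streaming("large_report_input.xml")
```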
Abstract:
Open innovation and the efficient exploitation of innovations are becoming important parts of companies' R&D processes. The purpose of this Master's thesis is to create a framework for the more effective management, within a research organisation, of technologies that do not belong to the company's core business. The constructive framework is built on theories of intellectual capital management and portfolio management. In addition, the thesis defines tools and techniques for evaluating surplus technologies. The new portfolio of surplus technologies can be used as a search engine, an idea bank, a communication tool, or a marketplace for technologies. Its management consists of documenting information in the system, evaluating the technologies, and updating and maintaining the portfolio.
Abstract:
Forensic intelligence is a distinct dimension of forensic science. Forensic intelligence processes have mostly been developed to address either a specific type of trace or a specific problem. Even though these empirical developments have led to successes, they are trace-specific in nature and contribute to the generation of silos which hamper the establishment of a more general and transversal model. Forensic intelligence has shown some important perspectives but more general developments are required to address persistent challenges. This will ensure the progress of the discipline as well as its widespread implementation in the future. This paper demonstrates that the description of forensic intelligence processes, their architectures, and the methods for building them can, at a certain level, be abstracted from the type of traces considered. A comparative analysis is made between two forensic intelligence approaches developed independently in Australia and in Europe regarding the monitoring of apparently very different kinds of problems: illicit drugs and false identity documents. An inductive effort is pursued to identify similarities and to outline a general model. Besides breaking barriers between apparently separate fields of study in forensic science and intelligence, this transversal model would assist in defining forensic intelligence, its role and place in policing, and in identifying its contributions and limitations. The model will facilitate the paradigm shift from the current case-by-case reactive attitude towards a proactive approach by serving as a guideline for the use of forensic case data in an intelligence-led perspective. A follow-up article will specifically address issues related to comparison processes, decision points and organisational issues regarding forensic intelligence (part II).
Abstract:
PURPOSE: To improve the risk stratification of patients with rhabdomyosarcoma (RMS) through the use of clinical and molecular biologic data. PATIENTS AND METHODS: Two independent data sets of gene-expression profiling for 124 and 101 patients with RMS were used to derive prognostic gene signatures by using a meta-analysis. These and a previously published metagene signature were evaluated by using cross validation analyses. A combined clinical and molecular risk-stratification scheme that incorporated the PAX3/FOXO1 fusion gene status was derived from 287 patients with RMS and evaluated. RESULTS: We showed that our prognostic gene-expression signature and the one previously published performed well with reproducible and significant effects. However, their effect was reduced when cross validated or tested in independent data and did not add new prognostic information over the fusion gene status, which is simpler to assay. Among nonmetastatic patients, patients who were PAX3/FOXO1 positive had a significantly poorer outcome compared with both alveolar-negative and PAX7/FOXO1-positive patients. Furthermore, a new clinicomolecular risk score that incorporated fusion gene status (negative and PAX3/FOXO1 and PAX7/FOXO1 positive), Intergroup Rhabdomyosarcoma Study TNM stage, and age showed a significant increase in performance over the current risk-stratification scheme. CONCLUSION: Gene signatures can improve current stratification of patients with RMS but will require complex assays to be developed and extensive validation before clinical application. A significant majority of their prognostic value was encapsulated by the fusion gene status. A continuous risk score derived from the combination of clinical parameters with the presence or absence of PAX3/FOXO1 represents a robust approach to improving current risk-adapted therapy for RMS.
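The abstract does not give the coefficients of the clinicomolecular risk score, so the sketch below is purely hypothetical: it only shows the general form of a continuous score that combines fusion-gene status, IRS TNM stage, and age as a weighted sum. The weights are made up for illustration and are not the published model.

```python
# Hypothetical illustration of a continuous clinicomolecular risk score of
# the kind described above. The weights below are invented for the sketch
# and are NOT the coefficients from the published risk-stratification model.

def risk_score(fusion_status: str, tnm_stage: int, age_years: float) -> float:
    """Return a toy continuous risk score; higher means poorer predicted outcome."""
    fusion_weight = {"negative": 0.0, "PAX7/FOXO1": 0.3, "PAX3/FOXO1": 1.2}
    stage_weight = 0.4 * (tnm_stage - 1)                     # IRS TNM stage 1-4
    age_weight = 0.5 if (age_years < 1 or age_years > 10) else 0.0
    return fusion_weight[fusion_status] + stage_weight + age_weight

print(risk_score("PAX3/FOXO1", tnm_stage=3, age_years=14))   # 2.5 on this toy scale
```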
Abstract:
In recent years, Semantic Web (SW) research has resulted in significant outcomes. Various industries have adopted SW technologies, while the 'deep web' has yet to reach the critical transformation point at which the majority of its data will be exploited through SW value layers. In this article we analyse SW applications from a 'market' perspective. We set out the key requirements for real-world, SW-enabled information systems and discuss the major difficulties that have delayed SW uptake. This article contributes to the SW and knowledge management literature by providing a context for discourse towards best practices on SW-based information systems.
Abstract:
This Master's thesis assumes that the fourth-generation mobile network is a seamless combination of existing second- and third-generation wireless networks and short-range WLAN and Bluetooth radio technologies. These technologies are also assumed to be so interoperable that the user does not notice when the access network changes. The thesis presents the architecture and basic operating principles of the most important wireless technologies related to fourth-generation mobile networks. It describes different techniques and practices for measuring and collecting data. The resulting transaction measurements can be used to offer differentiated service levels and to optimise network and service capacity. In addition, the thesis introduces Internet Business Information Manager, a software framework for distributed data collection. The measurement data it collects can be used for service-level monitoring and reporting as well as for billing. In the practical part of the work, the aim was to develop an agent that monitors wireless network traffic and observes quality of service. The agent would reside in a mobile phone, measuring network traffic. However, the agent could not be implemented because the software environment was found to be incomplete. Nevertheless, the work showed that there is a real need for agents that collect data from the user's point of view.
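As a rough illustration of the kind of client-side transaction measurement discussed above (not the Internet Business Information Manager itself, nor the unimplemented phone agent), the sketch below times a single HTTP transaction and records the latency for later service-level reporting; the URL and record fields are illustrative.

```python
# Minimal sketch of a user-side transaction measurement: time one request
# round trip and record it for service-level monitoring or billing input.
import time
import urllib.request

def measure_transaction(url: str) -> dict:
    """Time one HTTP transaction as seen from the user's terminal."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        body = response.read()
    elapsed_ms = (time.monotonic() - start) * 1000
    return {"url": url, "bytes": len(body), "latency_ms": round(elapsed_ms, 1)}

# Usage (hypothetical endpoint): measure_transaction("http://example.com/")
```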
Abstract:
The aim of this Master's thesis was to describe the workflow of the different functions in the order-delivery process when a product data management system is part of the working environment. The theoretical part of the thesis examined business process re-engineering and process definition, and presented the key areas of product data management (PDM). The background and strategies of the target company were presented, after which the changes were evaluated against the findings of the theoretical part. To define the current working practices, people from every stage of the order-delivery process within the production unit were interviewed. Finally, the company's product data management principles were described and the workflow was defined for the different stages of the process. As the new product data management system is taken into use, the company must also adopt the product data management mindset. Management of the product structure is now divided between different functions, so that the design structure, the production structure, and the service structure are the responsibility of different people. The configuration of these different structures during the order-delivery process determines the order in which tasks must be performed across the different systems. The multinational design organisation must also be taken into account during order processing. The product data management system is used together with the familiar design applications and the enterprise resource planning (ERP) system. The workflow diagram defines a company-wide model of how, and in what order, tasks must be performed in the different systems during the order-delivery process. This thesis examined the functions of the order-delivery process that are most relevant to product definition and the management of design data: sales, sales support, production control, application design, and documentation. In the future, it is recommended that the deployment of the product data management system also be considered in production and purchasing. The next development efforts related to the order-delivery process should be targeted at the order definition phase at the seller-customer interface, where mistakes made are multiplied at every stage of the process.
Abstract:
Globalization and the development of the information society are rapidly changing the shape of the modern world. Cities, and especially megacities such as Saint-Petersburg, are at the center of these changes. As a result, economic activities connected with receiving and processing information now play a very important role in the economy of megacities, which allows them to be characterized as "information" cities. Despite considerable experience in addressing information issues, Russia, and Saint-Petersburg in particular, lags behind the advanced European countries in the development of information systems. This master's thesis is devoted to the development of an information system (a data transmission network) based on wireless technology in the territory of the Saint-Petersburg region, within the framework of the FTOP "Electronic Russia" and RTOP "Electronic Saint-Petersburg" programs. Logically, the thesis can be divided into three parts: 1. the problems, purposes, expected results, schedule, and implementation of the "Electronic Russia" program; 2. a discussion of wireless data transmission networks (description of the technology, justification of its choice, signal transmission techniques, and types of network topology); 3. implementation of the network (organization of the central network node, regional centers, and access lines; description of the equipment used; network capabilities), financial provision for the project, and possible network management models.
Abstract:
BACKGROUND: Frequent emergency department (ED) users meet several of the criteria of vulnerability, but this needs to be further examined taking into consideration all of vulnerability's different dimensions. This study aimed to characterize frequent ED users and to define risk factors of frequent ED use within a universal health care coverage system, applying a conceptual framework of vulnerability. METHODS: A controlled, cross-sectional study comparing frequent ED users to a control group of non-frequent users was conducted at the Lausanne University Hospital, Switzerland. Frequent users were defined as patients with five or more visits to the ED in the previous 12 months. The two groups were compared using validated scales for each one of the five dimensions of an innovative conceptual framework: socio-demographic characteristics; somatic, mental, and risk-behavior indicators; and use of health care services. Independent t-tests, Wilcoxon rank-sum tests, Pearson's Chi-squared test and Fisher's exact test were used for the comparison. To examine the vulnerability-related risk factors for being a frequent ED user, univariate and multivariate logistic regression models were used. RESULTS: We compared 226 frequent users and 173 controls. Frequent users had more vulnerabilities in all five dimensions of the conceptual framework. They were younger, and more often immigrants from low/middle-income countries or unemployed, had more somatic and psychiatric comorbidities, were more often tobacco users, and had more primary care physician (PCP) visits. The most significant frequent ED use risk factors were a history of more than three hospital admissions in the previous 12 months (adj OR: 23.2, 95% CI = 9.1-59.2), the absence of a PCP (adj OR: 8.4, 95% CI = 2.1-32.7), living less than 5 km from an ED (adj OR: 4.4, 95% CI = 2.1-9.0), and household income lower than USD 2,800/month (adj OR: 4.3, 95% CI = 2.0-9.2). CONCLUSIONS: Frequent ED users within a universal health coverage system form a highly vulnerable population when taking into account all five dimensions of a conceptual framework of vulnerability. The predictive factors identified could be useful in the early detection of future frequent users, in order to address their specific needs and decrease vulnerability, a key priority for health care policy makers. Application of the conceptual framework in future research is warranted.
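To illustrate the modelling step only, the sketch below fits a multivariate logistic regression on synthetic data and converts the coefficients to adjusted odds ratios with 95% confidence intervals, mirroring the kind of analysis reported above. The predictors are simplified stand-ins for the study variables, and the numbers it prints are not the study's results.

```python
# Minimal sketch of the multivariate logistic-regression step used to derive
# adjusted odds ratios for frequent ED use, on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.integers(0, 2, n),   # >3 hospital admissions in past 12 months (yes/no)
    rng.integers(0, 2, n),   # no primary care physician (yes/no)
    rng.integers(0, 2, n),   # lives < 5 km from the ED (yes/no)
])
logit_p = -1.5 + 1.8 * X[:, 0] + 1.0 * X[:, 1] + 0.6 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))      # frequent-user indicator

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
adjusted_or = np.exp(model.params[1:])               # adjusted odds ratios
ci = np.exp(model.conf_int()[1:])                    # 95% confidence intervals
print(adjusted_or, ci)
```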