982 results for Data Center, Software Defined Networking, SDN


Relevance: 100.00%

Abstract:

BACKGROUND: In this study, we evaluated the ability of gene expression profiles to predict chemotherapy response and survival in triple-negative breast cancer (TNBC). METHODS: Gene expression and clinical-pathological data were evaluated in five independent cohorts, including three randomised clinical trials, for a total of 1055 patients with TNBC, basal-like disease (BLBC) or both. Previously defined intrinsic molecular subtypes and a proliferation signature were determined and tested. Each signature was tested using multivariable logistic regression models (for pCR (pathological complete response)) and Cox models (for survival). Within TNBC, interactions between each signature and the basal-like subtype (vs other subtypes) for predicting either pCR or survival were investigated. RESULTS: Within TNBC, all intrinsic subtypes were identified, but BLBC predominated (55-81%). Significant associations between genomic signatures and response and survival after chemotherapy were only identified within BLBC and not within TNBC as a whole. In particular, high expression of a previously identified proliferation signature, or low expression of the luminal A signature, was found to be independently associated with pCR and improved survival following chemotherapy across different cohorts. Significant interaction tests were only obtained between each signature and the BLBC subtype for prediction of chemotherapy response or survival. CONCLUSIONS: The proliferation signature predicts response and improved survival after chemotherapy, but only within BLBC. This highlights the clinical implications of TNBC heterogeneity and suggests that future clinical trials focused on this phenotypic subtype should consider stratifying patients as having BLBC or not.
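
To make the modelling described above concrete, the following minimal Python sketch shows how one could test a gene-expression signature together with a signature-by-subtype interaction, using a logistic model for pCR and a Cox model for survival. It is an illustration only, not the study's analysis code: the file name and all column names (pcr as a 0/1 outcome, proliferation, basal_like, dfs_years, event) are assumptions.

import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

df = pd.read_csv("tnbc_cohort.csv")  # hypothetical table, one row per patient

# Multivariable logistic regression for pathological complete response (pCR),
# with a proliferation-signature x basal-like interaction term.
logit = smf.logit("pcr ~ proliferation * basal_like", data=df).fit()
print(logit.summary())

# Cox proportional-hazards model for survival, using the same interaction.
df["prolif_x_basal"] = df["proliferation"] * df["basal_like"]
cph = CoxPHFitter()
cph.fit(df[["dfs_years", "event", "proliferation", "basal_like", "prolif_x_basal"]],
        duration_col="dfs_years", event_col="event")
cph.print_summary()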

Relevance: 100.00%

Abstract:

Since 1999, data from pulmonary hypertension (PH) patients from all PH centres in Switzerland have been prospectively collected. We analyse the epidemiological aspects of these data. PH was defined as a mean pulmonary artery pressure of >25 mm Hg at rest or >30 mm Hg during exercise. Patients with pulmonary arterial hypertension (PAH), PH associated with lung diseases, PH due to chronic thrombotic and/or embolic disease (CTEPH), or PH due to miscellaneous disorders were registered. Data from adult patients included between January 1999 and December 2004 were analysed. 250 patients were registered (age 58 +/- 16 years, 104 (41%) males). 152 patients (61%) had PAH, 73 (29%) had CTEPH and 18 (7%) had PH associated with lung disease. Patients <50 years (32%) were more likely to have PAH than patients >50 years (76% vs. 53%, p <0.005). Twenty-four patients (10%) were lost to follow-up, 58 patients (26%) died and 150 (66%) survived without transplantation or thromboendarterectomy. Survivors differed from patients who died in baseline six-minute walking distance (400 m [300-459] vs. 273 m [174-415]), functional impairment (NYHA class III/IV 86% vs. 98%), mixed venous saturation (63% [57-68] vs. 56% [50-61]) and right atrial pressure (7 mm Hg [4-11] vs. 11 mm Hg [4-18]). PH is a disease affecting adults of all ages. The management of these patients in specialised centres guarantees a high quality of care. Analysis of the registry data could be an instrument for quality control and might help identify weak points in the assessment and treatment of these patients.

Relevance: 100.00%

Abstract:

This User’s Guide serves as a reference for field personnel using the sign inventory data collection software tool. This tool was developed to simplify and standardize the collection and updating of sign inventory information. The software and collection methodology were developed by the Iowa DOT Sign Management Task Force and the Center for Transportation Research and Education at Iowa State University. Required equipment: the data collection process requires both a portable computer and a global positioning system (GPS) device (connected via USB cable). Since computer battery performance varies, a DC power converter is recommended. A check-in/check-out process has also been established, which allows updates to sign information from the central database.
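
As an illustration of the GPS-over-USB requirement mentioned above, the sketch below reads a position fix from a receiver that exposes NMEA sentences on a serial port (via pyserial). The port name, baud rate and sentence handling are assumptions and do not reflect the Iowa DOT tool's actual implementation.

import serial  # pyserial

def read_gga_fix(port="/dev/ttyUSB0", baudrate=4800):
    # Return (latitude, longitude) in decimal degrees from the next $GPGGA sentence.
    with serial.Serial(port, baudrate, timeout=2) as gps:
        while True:
            line = gps.readline().decode("ascii", errors="ignore").strip()
            if not line.startswith("$GPGGA"):
                continue
            f = line.split(",")
            lat = float(f[2][:2]) + float(f[2][2:]) / 60.0   # ddmm.mmmm -> degrees
            lon = float(f[4][:3]) + float(f[4][3:]) / 60.0   # dddmm.mmmm -> degrees
            if f[3] == "S":
                lat = -lat
            if f[5] == "W":
                lon = -lon
            return lat, lon

if __name__ == "__main__":
    print(read_gga_fix())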

Relevance: 100.00%

Abstract:

Software engineering is criticized as not being engineering or a 'well-developed' science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes. In this thesis, only effort estimation in software projects for project management purposes is discussed. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis, but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn. Effort estimation accuracy has significantly improved after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement. The author of this thesis has developed a three-level solution for the estimation model. All currently used size metrics are static in nature, but the newly proposed metric is dynamic: it takes advantage of the increased understanding of the nature of the work as specification and design work proceeds, and thus 'grows up' along with the software project. Developing an effort estimation model is not possible without gathering and analyzing history data. However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines; it requires effort even to maintain an achieved level of estimation accuracy. Estimation results over several successive releases are analyzed, and it is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity. An example is shown to shed more light on the calibration and the model itself, together with remarks about the sensitivity of the model. Finally, an example of usage is shown.
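
For orientation, the sketch below shows a textbook-style Function Point calculation feeding a simple effort estimate. It is not the hierarchical model developed in the thesis; the complexity weights are the standard IFPUG average-complexity values, and the productivity rate of 8 hours per function point is a purely hypothetical calibration figure.

# Standard IFPUG average-complexity weights per counted function type.
IFPUG_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,
    "external_interfaces": 7,
}

def unadjusted_function_points(counts):
    # Sum each counted function type multiplied by its complexity weight.
    return sum(IFPUG_WEIGHTS[k] * n for k, n in counts.items())

def effort_hours(fp, hours_per_fp=8.0):
    # Convert size to effort with an assumed historical productivity rate.
    return fp * hours_per_fp

counts = {"external_inputs": 12, "external_outputs": 8, "external_inquiries": 5,
          "internal_files": 4, "external_interfaces": 2}
fp = unadjusted_function_points(counts)
print(f"{fp} UFP -> {effort_hours(fp):.0f} person-hours (assumed 8 h/FP)")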

Relevance: 100.00%

Abstract:

Feasibility study on deploying open-source software-defined storage in enterprise environments, comparing Gluster, Ceph, OpenAFS, TahoeFS and XtreemFS.

Relevance: 100.00%

Abstract:

The aim of this thesis was to find out whether, in the users' opinion, the quality of the profitability reporting implemented by the commissioner of the thesis is sufficient. The profitability reporting has been implemented with data warehouse technology. Further aims of the thesis were to define what software quality means and how it can be evaluated. A qualitative research method was used. The material for the quality evaluation was collected by interviewing seventeen active users of the profitability reporting. In this thesis, software quality means its ability to meet or exceed the reasonable wishes and expectations of its users. Quality was evaluated with the six quality characteristics defined in the ISO/IEC 9126 standard, which describe software quality with minimal overlap. In addition, the evaluation made use of an informative annex, not part of the standard proper, which elaborates on the quality characteristics presented in ISO/IEC 9126. As a result of the study, it can be stated that, according to the users, the profitability reporting is of sufficient quality, since it provides easy-to-use, correctly formatted reports with sufficiently good response times for the users' needs. From this effective use it can be concluded that the construction of the data warehouse was successful. The study also brought up numerous development and improvement ideas, which will serve as one aid for future development work.

Relevance: 100.00%

Abstract:

Given the extensive use of chemometrics and its association with applications that require purchased licenses or routines, the objective of this work was to develop a free-access exploratory data analysis software application for academic use that is easy to install and can be operated without user-level programming. The developed software, called Chemostat, employs Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA) and interval Principal Component Analysis (iPCA), as well as correction methods, data transformation and outlier detection. Data can be imported from the clipboard, text files, ASCII files or FT-IR Perkin-Elmer ".sp" files. The software generates a variety of charts and tables for the analysis of results, which can be exported in several formats. Its main features were tested using mid-infrared and near-infrared spectra of vegetable oils and digital images obtained from different types of commercial diesel. To validate the software, the same data sets were analyzed using Matlab© and the results of the two applications matched in various combinations. In addition to the desktop version, the reuse of the algorithms allowed an online version to be provided that offers a unique experience on the web. Both applications are available in English.
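
The following minimal sketch shows the kind of PCA-based exploratory analysis described above, applied to a spectral data matrix with scikit-learn. It does not reproduce Chemostat's own routines; the input file and the preprocessing choices are assumptions.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical data matrix: rows are samples, columns are wavenumbers.
X = np.loadtxt("spectra.csv", delimiter=",")

X_scaled = StandardScaler().fit_transform(X)   # mean-center and scale each variable
pca = PCA(n_components=3)
scores = pca.fit_transform(X_scaled)           # sample scores on PC1-PC3

print("Explained variance ratio:", pca.explained_variance_ratio_)
print("Scores of the first sample:", scores[0])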

Relevance: 100.00%

Abstract:

This thesis consists of three main theoretical themes: quality of data, success of information systems, and metadata in data warehousing. Loosely defined, metadata is descriptive data about data, and, in this thesis, master data means reference data about customers, products, etc. The objective of the thesis is to contribute to the implementation of a metadata management solution for an industrial enterprise. The metadata system incorporates a repository, integration, delivery and access tools, as well as semantic rules and procedures for master data maintenance. It aims to improve the maintenance processes and the quality of hierarchical master data in the case company’s information systems. That should bring benefits to the whole organization through improved information quality, especially in cross-system data consistency, and through more efficient and effective data management processes. As the result of this thesis, the requirements for the metadata management solution in the case were compiled, and the success of the new information system and the implementation project was evaluated.

Relevance: 100.00%

Abstract:

The purpose of this thesis is to investigate projects funded under the European 7th Framework Programme Information and Communication Technology work programme. The research is limited to the theme "Pervasive and trusted network and service infrastructure", and the aim is to find out which topics future research will concentrate on. The thesis provides important information for the Department of Information Technology at Lappeenranta University of Technology. First, the thesis investigates the requirements for the projects funded under the "Pervasive and trusted network and service infrastructure" programme in 2007. Second, the projects funded under this programme are listed in tables and the most important keywords are gathered. Finally, based on keyword occurrences, a vision of the most important future topics is formed. According to the keyword analysis, wireless networks play an important role in the future, and core networks will be implemented with fiber technology to ensure fast data transfer. Software development favors Service Oriented Architecture (SOA) and open source solutions. Interoperability and ensuring privacy are also key concerns, as are 3D in all its forms and content delivery. When all the projects were compared, the most important issue was found to be SOA, which leads the way to cloud computing.
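
A minimal sketch of the keyword-frequency step described above is given below: it counts how many funded projects mention each keyword and ranks the keywords. The input file and its column layout are assumptions.

from collections import Counter
import csv

counter = Counter()
with open("fp7_projects.csv", newline="") as f:        # hypothetical file, one row per project,
    for row in csv.DictReader(f):                      # with keywords separated by ';'
        keywords = {k.strip().lower() for k in row["keywords"].split(";") if k.strip()}
        counter.update(keywords)                       # count each keyword once per project

for keyword, n_projects in counter.most_common(10):
    print(f"{keyword}: {n_projects} projects")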

Relevance: 100.00%

Abstract:

Cloud computing enables on-demand network access to shared resources (e.g., computation, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort. Cloud computing refers to both the applications delivered as services over the Internet and the hardware and system software in the data centers. Software as a Service (SaaS) is part of cloud computing; it is one of the cloud service models. SaaS is software deployed as a hosted service and accessed over the Internet. In SaaS, the consumer uses the provider's applications running in the cloud. SaaS separates the possession and ownership of software from its use. The applications can be accessed from any device through a thin client interface; a typical SaaS application is used with a web browser and priced monthly. In this thesis, the characteristics of cloud computing and SaaS are presented, and a few implementation platforms for SaaS are discussed. Then, four different SaaS implementation cases and one transformation case are examined. The pros and cons of SaaS are studied based on literature references and an analysis of the SaaS implementations and the transformation case. The analysis is done both from the customer's and the service provider's point of view. In addition, the pros and cons of on-premises software are listed. The purpose of this thesis is to find out when SaaS should be utilized and when it is better to choose traditional on-premises software. The qualities of SaaS bring many benefits both to the customer and to the provider. A customer should utilize SaaS when it provides cost savings, ease, and scalability over on-premises software. SaaS is reasonable when the customer does not need tailoring but only a simple, general-purpose service, and the application supports the customer's core business. A provider should utilize SaaS when it offers cost savings, scalability, faster development, and a wider customer base over on-premises software. It is wise to choose SaaS when the application is cheap, aimed at the mass market, needs frequent updating, needs high-performance computing, needs to store large amounts of data, or when there is some other direct value from the cloud infrastructure.

Relevance: 100.00%

Abstract:

This work studies data transmission with different modulations, bit rates and amplitude levels, and the results are examined by means of the Bit Error Ratio. Signals were also transmitted in coded form, and the advantages and disadvantages of coding were compared with those of uncoded data. The data stream travels in an AXMK cable, either together with the DC supply or in the grounding conductor. The results showed that a higher bit rate did not increase the amount of losses. The use of coding, on the other hand, reduced the number of bit errors.
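
For reference, the Bit Error Ratio used above is simply the fraction of received bits that differ from the transmitted bits. The sketch below computes it for a toy random channel; the bit-flip probability is an arbitrary illustration, not a measured value from the work.

import numpy as np

rng = np.random.default_rng(0)
tx_bits = rng.integers(0, 2, size=100_000)      # transmitted bit stream

# Toy channel: flip each bit independently with probability 1e-3.
flips = rng.random(tx_bits.size) < 1e-3
rx_bits = tx_bits ^ flips

ber = np.mean(tx_bits != rx_bits)               # errored bits / total bits
print(f"BER = {ber:.2e}")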

Relevance: 100.00%

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 100.00%

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 100.00%

Abstract:

The sixteen MPX detectors making up the ATLAS-MPX network were placed at different positions in the ATLAS detector and its cavern at CERN in order to measure in real time the radiation fields produced by primary particles (beam protons) and by secondary particles (kaons, pions, gammas, protons) resulting from the proton-proton collisions. Polyethylene (PE) and lithium fluoride (6LiF) films cover the detectors to increase their sensitivity to neutrons produced by primary and secondary particles interacting with the materials present in the ATLAS environment. The recognition of the tracks left by particles in an ATLAS-MPX detector is performed with the algorithms of the MAFalda ("Medipix Analysis Framework") software, which is based on the libraries of the ROOT data analysis software. A study of the misidentification rate and of cluster overlap was carried out by reconstructing the activities of 106Ru and 137Cs sources. The detection efficiency for fast neutrons was measured with 252Cf and 241AmBe sources (mean neutron energies of 2.13 and 4.08 MeV, respectively). The average detection efficiencies measured for neutrons produced by the 252Cf and 241AmBe sources were calculated for the 6LiF and PE converters and give (0.8580 ± 0.1490)% and (0.0254 ± 0.0031)% for 6LiF, and (0.0510 ± 0.0061)% and (0.0591 ± 0.0063)% for PE, at the low and high energy thresholds respectively. A simulation of the neutron detection efficiency calculation in the MPX detector was carried out with GEANT4. MPX data corresponding to proton-proton collisions at 2.4 TeV and 7 TeV in the centre of mass were analysed. The detected fluxes of electrons and photons are particularly high in the MPX01 and MPX14 detectors because they are closest to the collision point. Neutron fluxes were estimated using the measured detection efficiencies. A correlation with the LHC luminosity was established, and it is predicted that for collisions at 14 TeV in the centre of mass with a luminosity of 10^34 cm^-2 s^-1, about 5.1x10^8 ± 1.5x10^7 and 1.6x10^9 ± 6.3x10^7 particles will be detected by the MPX01 and MPX14 detectors, respectively.

Relevance: 100.00%

Abstract:

Software Defined Radio (SDR) hardware platforms use parallel architectures. Current approaches to developing applications (such as WLAN) for these platforms are complex, because developers must describe an application with hardware specifics relevant to parallelism, such as mapping and scheduling. To reduce this complexity, we have developed a new programming approach for SDR applications, called Virtual Radio Engine (VRE). VRE defines a language for describing applications, and a tool chain that consists of a compiler kernel and other tools (such as a code generator) to generate executables. The thesis presents this concept and describes the language and the compiler kernel developed by the author. The language is hardware-independent, i.e. developers describe tasks and the dependencies between them. The compiler kernel performs automatic parallelization, i.e. it is capable of transforming a hardware-independent program into a hardware-specific one by resolving hardware specifics, in particular mapping, scheduling and synchronization. Thus, VRE simplifies programming, as developers do not have to resolve hardware specifics manually.
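
To illustrate the idea of a hardware-independent task description whose mapping and scheduling are derived automatically, the sketch below declares a small task graph and assigns tasks to two processing elements with a naive heuristic. It is not the VRE language or compiler; the task names, costs and the two-processor target are assumptions.

from graphlib import TopologicalSorter

# Hardware-independent description: task -> set of tasks it depends on.
deps = {
    "sync": set(),
    "fft": {"sync"},
    "chan_est": {"fft"},
    "demap": {"fft"},
    "equalize": {"chan_est", "demap"},
    "decode": {"equalize"},
}
cost = {"sync": 3, "fft": 5, "chan_est": 2, "demap": 2, "equalize": 1, "decode": 6}

order = list(TopologicalSorter(deps).static_order())   # a valid execution order

# Naive mapping heuristic (ignores communication and exact start times):
# assign each task, in dependency order, to the processor with the least load so far.
load = {"PE0": 0, "PE1": 0}
mapping = {}
for task in order:
    pe = min(load, key=load.get)
    mapping[task] = pe
    load[pe] += cost[task]

print("order:", order)
print("mapping:", mapping)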