941 results for open source seismic data processing packages
Abstract:
Loss of autonomy at advanced ages is associated not only with ageing but also with characteristics of the physical and social environment. Recent research has shown that social networks, social integration and participation act as predictors of disability in old age. The aim of this work is to analyze the effect of the social network on the level of autonomy (in terms of instrumental and basic disability) in the early stages of old age.
Abstract:
The popularity of the World Wide Web has had a significant impact on society. Web pages are easily accessible and producing content for the Web is easy. Many applications are also developed for the Web environment. Web application development is characterized by freedom of choice and the pursuit of speed. Several mutually alternative technologies make Web application programming possible; they differ in execution speed, range of features and flexibility. Various auxiliary methods are used to support the programming, among them tools and the reuse of ready-made components. Ready-made components may be free of charge, open source, or commercial. During this bachelor's thesis an application was completed that draws charts from statistical data and displays them on a dynamic Web page. The application was implemented by making sensible use of the auxiliary methods: both programming tools and ready-made components were used in its development. The type and appearance of the charts were required to be user-configurable, and the application was also required to be easily extensible. These requirements were met by making the chart drawing partly programmable through the database.
Abstract:
This project falls within the UOC master's programme in Free Software, in the specialization of networks and operating systems. In it I collaborated with CETRAMSA, a company dedicated to informing citizens about the public transport offering in the Barcelona metropolitan area, to carry out the migration of its server infrastructure to free software.
Abstract:
Advances in flow cytometry and other single-cell technologies have enabled high-dimensional, high-throughput measurements of individual cells as well as the interrogation of cell population heterogeneity. However, in many instances, computational tools to analyze the wealth of data generated by these technologies are lacking. Here, we present a computational framework for unbiased combinatorial polyfunctionality analysis of antigen-specific T-cell subsets (COMPASS). COMPASS uses a Bayesian hierarchical framework to model all observed cell subsets and select those most likely to have antigen-specific responses. Cell-subset responses are quantified by posterior probabilities, and human subject-level responses are quantified by two summary statistics that describe the quality of an individual's polyfunctional response and can be correlated directly with clinical outcome. Using three clinical data sets of cytokine production, we demonstrate how COMPASS improves characterization of antigen-specific T cells and reveals cellular 'correlates of protection/immunity' in the RV144 HIV vaccine efficacy trial that are missed by other methods. COMPASS is available as open-source software.
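As a rough illustration of the kind of subject-level summaries the abstract describes, the sketch below computes two scores over simulated posterior probabilities. The names and weighting are loose analogues of COMPASS's functionality and polyfunctionality scores, not the package's exact definitions, which are given in the paper and its R implementation:

```python
import numpy as np

# 2**5 - 1 = 31 boolean cytokine-combination subsets for 5 markers
n_cytokines = 5
subsets = [tuple((i >> b) & 1 for b in range(n_cytokines))
           for i in range(1, 2 ** n_cytokines)]

# hypothetical posterior probabilities of antigen-specific response,
# one per subset, standing in for the Bayesian model's output
rng = np.random.default_rng(0)
post = rng.uniform(size=len(subsets))

# functionality-style score: average posterior probability over subsets
fs = float(post.mean())

# polyfunctionality-style score: up-weight subsets expressing more markers
degree = np.array([sum(s) for s in subsets])
pfs = float(np.sum(post * degree) / np.sum(degree))
```

Both scores fall in [0, 1] and can be correlated directly with a clinical outcome, which is the role the two summary statistics play in the abstract.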
Abstract:
Background Nowadays, combining the different sources of information to improve the biological knowledge available is a challenge in bioinformatics. Among the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: firstly the right kernel is chosen for each data set; secondly the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to any dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher/lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of biological knowledge.
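The two-step kernel integration described above can be sketched in a few lines of Python. This is a minimal illustration with simulated data: the RBF kernel choice, the unweighted average combination, and all array sizes are assumptions, not the authors' code:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(K, n_components=2):
    # step 1: double-center the kernel matrix
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # step 2: top eigenpairs give the sample projections
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    return vecs * np.sqrt(np.maximum(vals, 0))

# two heterogeneous data sources measured on the same 30 samples
rng = np.random.default_rng(0)
X1 = rng.normal(size=(30, 5))
X2 = rng.normal(size=(30, 8))

# combine one kernel per source, then reduce dimensionality jointly
K = 0.5 * rbf_kernel(X1) + 0.5 * rbf_kernel(X2)
Z = kernel_pca(K, n_components=2)
```

A weighted combination (or a kernel learned per statistical task) would slot in where the unweighted average is formed.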
Abstract:
The main objective of this study was to examine the use of computer-assisted auditing. The study is divided into a theoretical and an empirical part. The theoretical part reviews the audit process, presents the tools of computer-assisted auditing, and assesses, on the basis of the literature and other source material, the benefits computing brings and the risks it causes. In the empirical part, a questionnaire survey directed at auditors was used to investigate how widespread the use of computing is in the auditing world, how auditors themselves see its benefits and drawbacks, and how computer-assisted auditing will develop in the near future. The results are compared with those of earlier studies on the same topic. The comparison shows that the use of computers in audit work has clearly increased. It should be noted that the introduction of computing into auditing brings problems that must be recognized, but the additional benefits it delivers are so substantial that in the future effective audit work will not be possible without computer-assisted methods.
Abstract:
The second scientific meeting of the European systems genetics network for the study of complex genetic human disease using genetic reference populations (SYSGENET) took place at the Center for Cooperative Research in Biosciences in Bilbao, Spain, December 10-12, 2012. SYSGENET is funded by the European Cooperation in the Field of Scientific and Technological Research (COST) and represents a network of scientists in Europe that use mouse genetic reference populations (GRPs) to identify complex genetic factors influencing disease phenotypes (Schughart, Mamm Genome 21:331-336, 2010). About 50 researchers working in the field of systems genetics attended the meeting, which consisted of 27 oral presentations, a poster session, and a management committee meeting. Participants exchanged results, set up future collaborations, and shared phenotyping and data analysis methodologies. This meeting was particularly instrumental for conveying the current status of the US, Israeli, and Australian Collaborative Cross (CC) mouse GRP. The CC is an open source project initiated nearly a decade ago by members of the Complex Trait Consortium to aid the mapping of multigenetic traits (Threadgill, Mamm Genome 13:175-178, 2002). In addition, representatives of the International Mouse Phenotyping Consortium were invited to exchange ongoing activities between the knockout and complex genetics communities and to discuss and explore potential fields for future interactions.
Abstract:
PURPOSE: Statistical shape and appearance models play an important role in reducing the segmentation processing time of a vertebra and in improving results for 3D model development. Here, we describe the different steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. METHODS: The input dataset is composed of manually segmented anonymized patient computerized tomography (CT) scans. The alignment of the different datasets is done with Procrustes alignment on surface models, and then the registration is cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which includes the variability of the C2. RESULTS: The SSM was generated using 92 CT scans. The resulting SSM was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (an open source software for image analysis and scientific visualization) with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. CONCLUSION: The SSM of the vertebra allows the shape variability of the C2 to be represented. Moreover, the SSM will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.
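The PCA step of such a shape model can be sketched as follows. This is a toy illustration in which random vectors stand in for the 92 pre-aligned, segmented C2 surfaces; it is not the Statismo implementation:

```python
import numpy as np

def build_ssm(shapes):
    """shapes: (n_samples, n_points * 3) pre-aligned landmark coordinates.
    Returns the mean shape, the principal modes, and per-mode variance."""
    mean = shapes.mean(axis=0)
    X = shapes - mean
    # PCA via SVD of the centered data matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    var = s ** 2 / (len(shapes) - 1)
    return mean, Vt, var

def sample_shape(mean, modes, var, b):
    """New instance = mean + sum_i b_i * sqrt(var_i) * mode_i."""
    k = len(b)
    return mean + (b * np.sqrt(var[:k])) @ modes[:k]

# 92 'scans', each with 50 landmarks in 3D (simulated stand-ins)
rng = np.random.default_rng(1)
shapes = rng.normal(size=(92, 3 * 50))

mean, modes, var = build_ssm(shapes)
# vary the first three modes to generate a plausible new C2 shape
new_shape = sample_shape(mean, modes, var, np.array([1.5, -0.5, 0.0]))
```

Specificity, compactness and generalization, the evaluation criteria named in the abstract, are all computed from the same mean/modes/variance decomposition.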
Abstract:
A detailed analysis of the morphology and the Holocene seismic and sequence stratigraphy and architecture of the infralittoral sedimentary environment of the El Masnou coast (Catalonia, NW Mediterranean Sea) was carried out using multibeam bathymetry and GeoPulse seismic data. This environment extends down to 26-30 m water depth, and is defined morphologically by two depositional wedges whose seafloor is affected by erosive furrows, slides, fields of large- and small-scale wavy bedforms, and dredging trenches and pits. Erosive terraces are also identified in the transition domain toward the inner continental shelf. The Holocene stratigraphy of the infralittoral environment is defined by two major seismic sequences (lower and upper), each one formed by internal seismic units. The sequences and units are characterised by downlapping surfaces made up of deposits formed by progradation of coastal lithosomes. The stratigraphy and stratal architecture, displaying a retrogradational arrangement with progradational patterns of minor order, were controlled by different sea-level positions. The stratigraphic division represents the coastal response to the last fourth-order transgressive and highstand conditions, modulated by small-scale sea-level oscillations (≈1-2 m) of fifth to sixth order. This study also highlights the advantage of an integrated analysis using acoustic/seismic methods, combined with marine geological observations, for practical assessment of the anthropogenic effects on infralittoral domains.
Abstract:
This paper presents a prototype of an interactive web-GIS tool for risk analysis of natural hazards, in particular for floods and landslides, based on open-source geospatial software and technologies. The aim of the presented tool is to assist the experts (risk managers) in analysing the impacts and consequences of a certain hazard event in a considered region, providing an essential input to the decision-making process in the selection of risk management strategies by responsible authorities and decision makers. This tool is based on the Boundless (OpenGeo Suite) framework and its client-side environment for prototype development, and it is one of the main modules of a web-based collaborative decision support platform in risk management. Within this platform, the users can import necessary maps and information to analyse areas at risk. Based on provided information and parameters, loss scenarios (amount of damages and number of fatalities) of a hazard event are generated on the fly and visualized interactively within the web-GIS interface of the platform. The annualized risk is calculated based on the combination of resultant loss scenarios with different return periods of the hazard event. The application of this developed prototype is demonstrated using a regional data set from one of the case study sites, Fella River of northeastern Italy, of the Marie Curie ITN CHANGES project.
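The combination of loss scenarios with different return periods into an annualized risk can be sketched as a numerical integration of loss over annual exceedance probability (1/T). This is a common convention, not necessarily the platform's exact formula, and the return periods and loss figures below are hypothetical:

```python
def annualized_risk(scenarios):
    """scenarios: iterable of (return_period_years, loss) pairs.
    Integrates loss over annual exceedance probability p = 1/T
    with the trapezoidal rule, giving an expected annual loss."""
    pts = sorted((1.0 / T, loss) for T, loss in scenarios)
    risk = 0.0
    for (p0, l0), (p1, l1) in zip(pts, pts[1:]):
        risk += 0.5 * (l0 + l1) * (p1 - p0)
    return risk

# hypothetical loss scenarios for 30-, 100- and 300-year flood events
risk = annualized_risk([(30, 1.0e6), (100, 5.0e6), (300, 9.0e6)])
print(risk)
```

In the web-GIS workflow described above, the loss values would come from the generated loss scenarios (damages and fatalities) rather than fixed constants.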
Abstract:
This project explores the market for a good open source business intelligence solution that would let the managers of a fitness club improve the management of their centres and answer some of the questions they have begun to ask about how their business is performing, which they suspect has suffered a decline in profits and in member confidence. The goal of the work was to create a data warehouse fitted to the data available, transform the data through ETL processes, and build OLAP cubes to exploit it effectively from the chosen BI platform.
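The warehouse-and-cube idea can be illustrated with a toy extract-transform-load step using only the Python standard library. Table names and data are invented for illustration; the actual project used a dedicated open source BI platform:

```python
import sqlite3

# toy ETL: raw gym check-ins -> a small aggregated fact table
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE checkins_raw (member TEXT, centre TEXT, day TEXT)")
cur.executemany("INSERT INTO checkins_raw VALUES (?, ?, ?)", [
    ("ana", "north", "2024-01-03"),
    ("ana", "north", "2024-01-10"),
    ("joan", "south", "2024-01-03"),
])

# the transform/load step: a grouped aggregate by centre and month,
# the simplest analogue of an OLAP cube's (centre, month) dimensions
cur.execute("""
    CREATE TABLE fact_visits AS
    SELECT centre, substr(day, 1, 7) AS month, COUNT(*) AS visits
    FROM checkins_raw
    GROUP BY centre, month
""")
for row in cur.execute("SELECT * FROM fact_visits ORDER BY centre"):
    print(row)
```

A real deployment would add dimension tables (members, centres, calendar) around the fact table, which is what makes slicing the cube by arbitrary dimensions cheap.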
Abstract:
In the context of the evidence-based practices movement, the emphasis on computing effect sizes and combining them via meta-analysis does not preclude the demonstration of functional relations. For the latter aim, we propose to augment the visual analysis to add consistency to the decisions made on the existence of a functional relation without losing sight of the need for a methodological evaluation of what stimuli and reinforcement or punishment are used to control the behavior. Four options for quantification are reviewed, illustrated, and tested with simulated data. These quantifications include comparing the projected baseline with the actual treatment measurements, on the basis of either parametric or nonparametric statistics. The simulated data used to test the quantifications include nine data patterns in terms of the presence and type of effect and comprising ABAB and multiple baseline designs. Although none of the techniques is completely flawless in terms of detecting a functional relation only when it is present but not when it is absent, an option based on projecting split-middle trend and considering data variability as in exploratory data analysis proves to be the best performer for most data patterns. We suggest that the information on whether a functional relation has been demonstrated should be included in meta-analyses. It is also possible to use as a weight the inverse of the data variability measure used in the quantification for assessing the functional relation. We offer easy-to-use code for open-source software implementing some of the quantifications.
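The split-middle option mentioned above, projecting the baseline trend into the treatment phase and comparing the actual treatment measurements against it, can be sketched as follows. This is a minimal illustration, not the authors' published code; the convention used for splitting odd-length series is one common choice among several:

```python
import statistics

def split_middle_trend(y):
    """Fit the split-middle line to a baseline series: the medians of
    the x and y values in each half define two points of the line."""
    n = len(y)
    first, second = y[: n // 2], y[(n + 1) // 2 :]
    x1 = (len(first) - 1) / 2                      # median x of first half
    x2 = (n + 1) // 2 + (len(second) - 1) / 2      # median x of second half
    y1, y2 = statistics.median(first), statistics.median(second)
    slope = (y2 - y1) / (x2 - x1)
    return slope, y1 - slope * x1

def points_beyond_projection(baseline, treatment):
    """Project the baseline trend into the treatment phase and count
    treatment measurements falling below the projected line."""
    slope, intercept = split_middle_trend(baseline)
    n0 = len(baseline)
    projected = [intercept + slope * (n0 + i) for i in range(len(treatment))]
    return sum(t < p for t, p in zip(treatment, projected))

baseline = [8, 7, 9, 8, 7, 8]      # stable baseline, flat trend
treatment = [6, 5, 4, 4, 3, 2]     # decreasing behavior under treatment
print(points_beyond_projection(baseline, treatment))
```

A variability band around the projected trend (as in exploratory data analysis) would turn the raw count into the better-performing option the abstract describes.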
Abstract:
Web application performance testing is an emerging and important field of software engineering. As web applications become more commonplace and complex, the need for performance testing will only increase. This paper discusses common concepts, practices and tools that lie at the heart of web application performance testing. A pragmatic, hands-on approach is assumed where applicable; real-life examples of test tooling, execution and analysis are presented right next to the underpinning theory. At the client-side, web application performance is primarily driven by the amount of data transmitted over the wire. At the server-side, selection of programming language and platform, implementation complexity and configuration are the primary contributors to web application performance. Web application performance testing is an activity that requires delicate coordination between project stakeholders, developers, system administrators and testers in order to produce reliable and useful results. Proper test definition, execution, reporting and repeatable test results are of utmost importance. Open-source performance analysis tools such as Apache JMeter, Firebug and YSlow can be used to realise effective web application performance tests. A sample case study using these tools is presented in this paper. The sample application was found to perform poorly even under the moderate load incurred by the sample tests.
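The core of a load test like those run with JMeter, firing concurrent requests and summarizing per-request latency, can be sketched with the Python standard library. The sleep call below is a stand-in for a real HTTP request (e.g. via `urllib.request.urlopen`) against the application under test:

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def load_test(request_fn, n_requests=50, concurrency=5):
    """Fire n_requests through a thread pool and collect per-request
    latency: the essence of what tools like JMeter automate."""
    def timed(_):
        t0 = time.perf_counter()
        request_fn()
        return time.perf_counter() - t0

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed, range(n_requests)))
    return {
        "mean": statistics.mean(latencies),
        "p95": latencies[int(0.95 * (len(latencies) - 1))],
        "max": latencies[-1],
    }

# stand-in request: a fixed 10 ms 'server response time'
stats = load_test(lambda: time.sleep(0.01))
print(stats)
```

Repeatable results, stressed in the abstract, come from fixing the request mix, concurrency and duration across runs and reporting percentiles rather than means alone.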
Abstract:
The purpose of this thesis is to investigate projects funded in the European 7th Framework Information and Communication Technology work programme. The research is limited to the theme "Pervasive and trusted network and service infrastructure", and the aim is to find out the most important topics on which research will concentrate in the future. The thesis provides important information for the Department of Information Technology at Lappeenranta University of Technology. First, the thesis investigates the requirements for the projects funded under the "Pervasive and trusted network and service infrastructure" programme in 2007. Second, the funded projects are listed in tables and the most important keywords are gathered. Finally, based on keyword appearances, a vision of the most important future topics is defined. According to the keyword analysis, wireless networks will play an important role in the future, and core networks will be implemented with fibre technology to ensure fast data transfer. Software development favours Service Oriented Architecture (SOA) and open source solutions. Interoperability and ensuring privacy are key concerns. 3D in all its forms and content delivery are important topics as well. When all the projects were compared, the most important issue was found to be SOA, which leads the way to cloud computing.
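The keyword-appearance analysis the abstract describes amounts to frequency counting over the gathered project keywords, which can be sketched as follows (the keyword lists are invented for illustration; the thesis gathered them from the actual funded projects):

```python
from collections import Counter

# hypothetical keyword lists, one per funded project
project_keywords = [
    ["SOA", "cloud computing", "open source"],
    ["wireless", "SOA", "privacy"],
    ["3D", "content delivery", "SOA"],
    ["fibre", "wireless", "interoperability"],
]

# count appearances across all projects and rank the topics
counts = Counter(kw for project in project_keywords for kw in project)
for keyword, n in counts.most_common(3):
    print(keyword, n)
```

Ranking by appearance count is what surfaces SOA as the dominant topic in the abstract's conclusion.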
Abstract:
The current research focuses on various questions raised and deliberated upon by different entrepreneurs. It makes a valuable contribution to understanding the importance of social media and ICT applications, and demonstrates how to support management consulting and business coaching start-ups with the help of social media and ICT tools. The thesis presents a literature review drawing on information systems science, SME and e-business journals, web articles, and survey analysis reports on social media applications. The methodology is qualitative: social anthropological approaches were used to oversee the case study activities and collect data, and a collaborative social research approach underpinned the action research method. The research found that new business start-ups and small businesses do not use social media and ICT tools the way most large corporations do, even though current open-source ICT technologies and social media applications are just as available to new and small businesses as they are to larger companies. Successful implementation of social media and ICT applications can markedly enhance start-up performance and help overcome business obstacles. The thesis sheds light on effective and innovative implementation of social media and ICT applications for new and small business owners.