961 results for Study platform
Abstract:
In this paper we present an experimental validation of the reliability increase of digital circuits implemented in Xilinx™ FPGAs when they are built using the DSPs (Digital Signal Processors) available in the reconfigurable device. For this purpose, we used NESSY [1], a fault-injection platform developed by our research group. The experiments demonstrate that the probability of a SEU (Single Event Upset) affecting the circuit is similar in circuits implemented with and without embedded DSPs. However, the former are more efficient in terms of area usage, which leads to a decrease in the probability of a SEU occurring.
Abstract:
Developing a fast, inexpensive, and specific test that reflects the mutations present in Mycobacterium tuberculosis isolates according to geographic region is the main challenge for drug-resistant tuberculosis (TB) control. The objective of this study was to develop a molecular platform for the rapid diagnosis of multidrug-resistant (MDR) and extensively drug-resistant TB based on single nucleotide polymorphism (SNP) mutations present in the rpoB, katG, inhA, ahpC, and gyrA genes from Colombian M. tuberculosis isolates. The amplification and sequencing of each target gene were performed. Capture oligonucleotides, which were tested before being used with isolates to assess performance, were designed for wild-type and mutated codons, and the platform was standardised based on the reverse-hybridisation principle. This method was tested on DNA samples extracted from clinical isolates from 160 Colombian patients previously characterised, phenotypically and genotypically, as having susceptible or MDR M. tuberculosis. For our method, the kappa index against the sequencing results was 0.966, 0.825, 0.766, 0.740, and 0.625 for rpoB, katG, inhA, ahpC, and gyrA, respectively. Sensitivity and specificity ranged between 90% and 100% compared with phenotypic drug susceptibility testing. Our assay helps pave the way for locally implemented, specifically adapted methods that can simultaneously detect drug-resistance mutations to first- and second-line drugs within a few hours.
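The agreement figures above are Cohen's kappa indices comparing the platform with sequencing. As a minimal illustration of how such an index is computed (the calls below use hypothetical labels, not the Colombian isolate data):

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa: agreement between two raters corrected for chance."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    # Chance agreement: probability both raters pick the same label at random
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical calls for ten isolates: 1 = mutation detected, 0 = wild type
platform   = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
sequencing = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
print(round(cohen_kappa(platform, sequencing), 3))   # -> 0.8
```

A kappa of 0.966 for rpoB, as reported above, indicates near-perfect agreement on that gene.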
Abstract:
One objective of this study is to classify socio-demographic components at the city-section level in the city of Lisbon. To build a suitable platform for the restaurant potentiality map, socio-demographic components were selected to produce a map of spatial clusters according to restaurant suitability. The second objective is to obtain a potentiality map expressed as underestimation or overestimation of the number of restaurants. To the best of our knowledge, no identical methodology for estimating restaurant potentiality has been reported. The results were achieved by combining a SOM (Self-Organizing Map), which provides a segmentation map, with a GAM (Generalized Additive Model) including a spatial component for restaurant potentiality. The final results indicate that the strongest influences on restaurant potentiality are tourist sites, spatial autocorrelation in terms of neighbouring restaurants (the spatial component), and tax value, while lower importance is attributed to households with one or two members and to the employed population, respectively. An important additional conclusion is that the most attractive market sites show no change or only moderate underestimation in terms of restaurant potentiality.
Abstract:
Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
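Consensus clustering of the kind ArrayMining.net offers is commonly built on a co-association matrix: pairs of samples that land in the same cluster across many runs are grouped together. The sketch below is a generic illustration of that idea, not ArrayMining.net's actual implementation:

```python
def coassociation_matrix(clusterings):
    """m[i][j] = fraction of clusterings in which samples i and j share a cluster."""
    n = len(clusterings[0])
    m = [[0.0] * n for _ in range(n)]
    for labels in clusterings:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    m[i][j] += 1.0
    k = len(clusterings)
    return [[v / k for v in row] for row in m]

def consensus_clusters(clusterings, threshold=0.5):
    """Greedy grouping: samples co-clustered in more than `threshold` of runs
    are assigned the same consensus label."""
    m = coassociation_matrix(clusterings)
    n = len(m)
    assigned = [None] * n
    next_label = 0
    for i in range(n):
        if assigned[i] is None:
            assigned[i] = next_label
            for j in range(i + 1, n):
                if assigned[j] is None and m[i][j] > threshold:
                    assigned[j] = next_label
            next_label += 1
    return assigned

# Three hypothetical clusterings of five samples from different algorithms
runs = [[0, 0, 1, 1, 1],
        [0, 0, 1, 1, 0],
        [1, 1, 0, 0, 0]]
print(consensus_clusters(runs))   # -> [0, 0, 1, 1, 1]
```

Real consensus-clustering implementations usually run hierarchical clustering on the co-association matrix instead of the greedy pass above, but the matrix itself is the shared ingredient.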
Abstract:
Visualization of vector fields plays an important role in research activities nowadays. Web applications allow fast, multi-platform and multi-device access to data, which creates the need for applications optimized for both high-performance and low-performance devices. Point-trajectory calculation procedures usually perform repeated calculations, since several points may lie on the same trajectory. This paper presents a new methodology to calculate point trajectories over a highly dense and uniformly distributed grid of points, in which the trajectories are forced to pass through the points of the grid. Its advantages rely on a highly parallel computing architecture implementation and on the reduction of the computational effort needed to calculate the stream paths, since unnecessary calculations are avoided by reusing data across iterations. As a case study, the visualization of oceanic currents in a web platform is presented and analyzed, using WebGL as both the parallel computing architecture and the rendering Application Programming Interface.
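The reuse of trajectory data described above can be sketched as a memoised lookup over a precomputed next-point map: once a point's downstream path is known, every point flowing into it reuses that suffix. This is a simplified serial illustration with a hypothetical 1-D grid (the paper's implementation is parallel, in WebGL):

```python
def grid_trajectories(next_point):
    """next_point[i] = index of the grid point the field advects point i to,
    or None at a sink/outflow. Returns the full trajectory of every point,
    reusing already-computed suffixes shared between trajectories."""
    cache = {}

    def trajectory(i):
        if i in cache:
            return cache[i]
        cache[i] = [i]          # placeholder; also truncates any cycle at i
        nxt = next_point[i]
        if nxt is not None and nxt != i:
            cache[i] = [i] + trajectory(nxt)
        return cache[i]

    return [trajectory(i) for i in range(len(next_point))]

# Toy 1-D field: every point flows to the right and exits after index 4
nxt = [1, 2, 3, 4, None]
print(grid_trajectories(nxt))
# -> [[0, 1, 2, 3, 4], [1, 2, 3, 4], [2, 3, 4], [3, 4], [4]]
```

The point of the cache is that the path from index 1 onward is computed once and reused by the trajectory starting at index 0, which is exactly the repeated-calculation saving the methodology targets.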
Abstract:
Doctoral Thesis in Veterinary Sciences, Specialty of Biological and Biomedical Sciences
Abstract:
Doctoral Thesis in Veterinary Sciences, in the Specialty of Biological and Biomedical Sciences
Abstract:
Technologies for Big Data and Data Science are receiving increasing research interest nowadays. This paper introduces the prototype architecture of a tool aimed at solving Big Data optimization problems. Our tool combines the jMetal framework for multi-objective optimization with Apache Spark, a technology that is gaining momentum. In particular, we make use of the streaming facilities of Spark to feed an optimization problem with data from different sources. We demonstrate the use of our tool by solving a dynamic bi-objective instance of the Traveling Salesman Problem (TSP) based on near-real-time traffic data from New York City, which is updated several times per minute. Our experiment shows that jMetal and Spark can be integrated to provide a software platform for dealing with dynamic multi-objective optimization problems.
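The core of such a dynamic bi-objective TSP is that a streamed traffic update changes the travel-time matrix, so an existing tour must be re-scored against both objectives. The sketch below is an illustrative stand-in with made-up numbers, not the jMetal/Spark code:

```python
def route_objectives(route, dist, time):
    """Bi-objective evaluation of a closed TSP tour: (total distance, total
    travel time). The time matrix is the part a traffic stream would update."""
    legs = list(zip(route, route[1:] + route[:1]))
    return (sum(dist[a][b] for a, b in legs),
            sum(time[a][b] for a, b in legs))

# Hypothetical 4-city instance; values are illustrative, not NYC data
dist = [[0, 2, 9, 4], [2, 0, 6, 3], [9, 6, 0, 8], [4, 3, 8, 0]]
time = [[0, 4, 5, 6], [4, 0, 7, 2], [5, 7, 0, 9], [6, 2, 9, 0]]
route = [0, 1, 2, 3]

print(route_objectives(route, dist, time))   # (20, 26)
time[1][2] = time[2][1] = 20                 # a streamed traffic update arrives
print(route_objectives(route, dist, time))   # (20, 39): distance unchanged, time worse
```

In the actual platform, jMetal's metaheuristics would react to the changed objective values by re-optimizing the tour; here only the evaluation step is shown.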
Abstract:
Presentation. Research on the Practicum and externships has a long history and involves important aspects for analysis. For example, recent changes in university degrees allot more credits to the Practicum course in all degrees, and Company-University collaboration has exposed the need to study new learning environments. The rise of ICT practices such as ePortfolios, which require technological solutions and methods supported by experimentation, study and research, deserves particular examination given the dynamic momentum of technological innovation. Tutoring the Practicum and externships requires remote monitoring and communication using ePortfolios, and competence-based assessment and the students' requirement to provide evidence of learning call for the best tutoring methods available with ePortfolios. Among the elements of ePortfolios, eRubrics emerge as a tool for design, communication and competence assessment. This project aims to consolidate a research line on eRubrics, already undertaken by a previous R+D+i project [EDU2010-15432], in order to expand the network of researchers and Centres of Excellence in Spain and other countries: Harvard University in the USA, the University of Cologne in Germany, the University of Colima in Mexico, the Federal University of Paraná and the University of Santa Catarina in Brazil, and Stockholm University in Sweden(1). This new project [EDU2013-41974-P](2) examines the impact of eRubrics on tutoring and assessing the Practicum course and externships. Through technology, distance tutoring grants an extra dimension to human communication. New forms of teaching with technological mediation are on the rise and are highly valuable, not only in formal education but especially in both public and private sectors of non-formal education, such as occupational training, education for the unemployed and public-servant training. Objectives. Obj. 1.
To analyse the models of technology used in assessing learning in the Practicum of all degrees at Spanish Faculties of Education. Obj. 2. To study models of learning assessment mediated by eRubrics in the Practicum. Obj. 3. To analyse communication through eRubrics between students and their tutors at the university and at practice centres, focusing on students' understanding of the competences and evidence to be assessed in the Practicum. Obj. 4. To design assessment services and products in order to federate companies and practice centres with training institutions. Among many other features, CoRubric(3) has the following functions: 1. The possibility to assess people, products or services using rubrics. 2. Ipsative assessment. 3. Design of fully flexible rubrics. 4. Drafting reports and exporting results from the eRubrics in a project. 5. Students and teachers discuss the evaluation and the application of the criteria. Methodology, Methods, Research Instruments or Sources Used. The project will use techniques to collect and analyse data from two methodological approaches: 1. To meet the first objective, we propose an initial exploratory descriptive study (Buendía Eisman, Colás Bravo & Hernández Pina, 1998), which involves conducting interviews with Practicum coordinators from all educational degrees across Spain, as well as analysing the contents of the teaching guides used in all educational degrees across Spain. Fifty-five academic managers from about ten faculties of education in public universities in Spain (20%) were interviewed, and 376 course guides from 36 public institutions in Spain (72%) were analysed. 2. To satisfy the second objective, 7 universities were selected to implement the project's two instruments, aimed at tutors at practice centres and tutors at the faculty. All data-collection instruments were validated by experts using the Delphi method.
The selection of experts considered three aspects: years of professional experience; number and quality of publications in the field (Practicum, Educational Technology and Teacher Training); and self-rating of their own knowledge. The resulting data were used to calculate the Coefficient of Competence (Kcomp) (Martínez, Zúñiga, Sala & Meléndez, 2012); in all cases the results showed an average of more than 0.09 points. The two instruments for the first objective were validated during the first half of the 2014-15 academic year, with data collected during the second half; the instruments for the second objective were validated during the first half of the 2015-16 year, with data collection in the second half. The set of four instruments (two for each of objectives 1 and 2) share the same dimensions across all sources (coordinators, course guides, tutors at practice centres and faculty tutors): a. Institution-Organization; b. Nature of internships; c. Relationship between agents; d. Practicum management; e. Assessment; f. Technological support; g. Training; h. Assessment ethics. Conclusions, Expected Outcomes or Findings. The first results respond to Objective 1, where we draw different conclusions depending on each of the dimensions. Regarding the internal regulations governing the organization and structure of the Practicum, we note that the most traditional degrees (Elementary and Primary) share common internal rules, particularly regarding development methodology and criteria, in contrast to other degrees (Pedagogy and Social Education). It is also true that the practice centres in the latter cases differ greatly from one another and can be a public institution, a school, a company, a museum, etc. The report (56.34%) and daily activity logs (43.67%) are the items most frequently demanded of students in all degrees, followed by lesson plans (28.18%), portfolios (19.72%), didactic units (26.7%) and others (32.4%).
The technological support used was mainly the university's platform (47.89%) and email (57.75%), followed by other services and tools (9.86%) and rubric platforms (1.41%). The assessment criteria are divided among formal aspects (12.38%), written expression (12.38%), treatment of the subject (14.45%), methodological rigour of the work (10.32%), and level of argument, clarity and relevance of the conclusions (10.32%). In general terms, we could say that there is an ongoing trend and debate between formative assessment and assessment for accreditation. There has not yet been sufficient time to study and compare the other dimensions and sources of information in more depth; we hope to provide further analysis and conclusions by the conference date.
Abstract:
Surface plasmon resonance (SPR) and localized surface plasmon resonance (LSPR) biosensors have brought a revolutionary change to the in vitro study of biological and biochemical processes due to their ability to measure extremely small changes in surface refractive index (RI), binding equilibrium and binding kinetics. Strategies based on LSPR have been employed to enhance sensitivity for a variety of applications, such as disease diagnosis, environmental analysis, food safety, and chemical threat detection. In LSPR spectroscopy, absorption and scattering of light are greatly enhanced at frequencies that excite the LSPR, resulting in a characteristic extinction spectrum that depends on the RI of the surrounding medium. Compositional and conformational changes within the surrounding medium near the sensing surface can therefore be detected as shifts in the extinction spectrum. This dissertation focuses on the development and evaluation of highly sensitive LSPR biosensors for the in situ study of biomolecular binding processes by incorporating nanotechnology. Compared with traditional methods for biomolecular binding studies, LSPR-based biosensors offer real-time, label-free detection. First, we modified the gold sensing surface of LSPR-based biosensors with nanomaterials such as gold nanoparticles (AuNPs) and polymer to enhance surface absorption and sensitivity. The performance of this type of biosensor was evaluated in a binding-affinity study of small heavy-metal molecules. This biosensor exhibited a roughly 7-fold sensitivity enhancement and the capability of measuring binding kinetics compared with traditional biosensors. Second, a miniaturized cell culture system was integrated into the LSPR-based biosensor system for real-time studies of biomarker signaling pathways and drug efficacy with living cells. To the best of our knowledge, this is the first LSPR-based sensing platform capable of living-cell studies. We demonstrated this capability by studying the VEGF signaling pathway in living SKOV-3 cells. Results show that the VEGF secretion level from SKOV-3 cells is 0.0137 ± 0.0012 pg per cell. Moreover, we demonstrated bevacizumab drug regulation of the VEGF signaling pathway using this biosensor. This sensing platform could potentially help in studying biomolecular binding kinetics, elucidating the underlying mechanisms of biotransport and drug delivery.
Abstract:
Levulinic acid (LA) is a polyfunctional molecule obtained from biomass. Because of its structure, the United States Department of Energy classified LA as one of the top 12 building-block chemicals. Typically, it is valorized through chemical reduction to γ-valerolactone (GVL). This is usually done with H2 in batch systems at high H2 pressures over noble-metal catalysts, making it expensive and less widely applicable. Therefore, alternative approaches such as catalytic transfer hydrogenation (CTH) through the Meerwein-Ponndorf-Verley (MPV) reaction over heterogeneous catalysts have been studied. CTH uses organic molecules (alcohols) acting as hydride-transfer agents (H-donors) to reduce molecules containing carbonyl groups. Given the stability of the intermediate, reports have shown the batch liquid-phase CTH of levulinate esters with secondary alcohols, and remarkable results (GVL yield) have been obtained over ZrO2, given the need for a pair of Lewis acid sites (LASites) and base sites for CTH. However, there were no reports of the continuous gas-phase CTH of levulinate esters. Therefore, high-surface-area ZrO2 was tested for the gas-phase CTH of methyl levulinate (ML) using ethanol, methanol and isopropanol as H-donors. Under optimized conditions with ethanol (250 °C), the reaction is selective towards GVL (70% yield). However, heavy carbonaceous materials on the catalyst surface progressively blocked the LASites, changing the chemoselectivity. In situ regeneration of the catalyst permitted a partial recovery of the LASites and an almost total recovery of the initial catalytic behaviour, proving the deactivation reversible. Tests with methanol were not promising (35% ML conversion, 4% GVL yield). As expected, isopropanol provided complete conversion and an 80% GVL yield. The reaction was also tested using bioethanol derived from agricultural waste.
In addition, a preliminary study was performed on the hydrogenolysis of polyols to produce bioethanol, where a Pd-Fe catalyst promoted the ethanol-selective (37%) hydrogenolysis of glycerol.
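The conversion, yield and selectivity figures quoted above are related by standard definitions; the following is a minimal sketch with illustrative mole flows (not the measured data of this work):

```python
def conversion(feed_in, feed_out):
    """Fractional conversion of the limiting reactant (e.g. methyl levulinate)."""
    return (feed_in - feed_out) / feed_in

def yield_(product_out, feed_in):
    """Molar yield of a product (e.g. GVL) relative to the feed."""
    return product_out / feed_in

def selectivity(product_out, feed_in, feed_out):
    """Fraction of converted feed that ended up in the desired product."""
    return product_out / (feed_in - feed_out)

# Hypothetical mole flows (mol/h), chosen to reproduce a 70% GVL yield
ml_in, ml_out, gvl_out = 1.00, 0.05, 0.70
print(round(conversion(ml_in, ml_out), 4))             # 0.95
print(round(yield_(gvl_out, ml_in), 4))                # 0.7
print(round(selectivity(gvl_out, ml_in, ml_out), 4))
```

Note that yield = conversion × selectivity, which is why a complete-conversion run (the isopropanol case) has its yield equal to its selectivity.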
Abstract:
When offshore oil and gas supplies are exhausted, offshore platforms must be decommissioned and removed. The present thesis highlights the importance of evaluating the possible reuse of decommissioned offshore jacket platforms for offshore wind energy. To convert to the new structure, the topside must be removed from the substructure so that a wind turbine can be installed in its place. The feasibility of this project was investigated using the finite element analysis software Sesam. To study fatigue life in offshore structures, an exhaustive review of the background and state of the art was carried out. A finite element model was created in Sesam, and two different fatigue analysis approaches were applied and compared. Finally, an analysis methodology is suggested for the structural fatigue analysis of offshore wind turbine structures based on international standards, addressing the industry's need to account for the combined effect of wind and hydrodynamic loads on this type of structure.
Abstract:
When offshore oil and gas supplies are exhausted, most offshore platforms are decommissioned and removed. The purpose of this thesis is to evaluate the fatigue damage that will occur during the service life of a jacket-type offshore platform using different fatigue approaches at particular locations. The locations considered for this metocean climate impact study were Norway (North Sea), Portugal (Atlantic Ocean, Leixões) and Italy (Adriatic Sea). A finite element model was created in Sesam, and two different fatigue analyses, deterministic and spectral, were applied. For the fatigue assessment, an appropriate description of the site-specific wave environment during the jacket platform's service life must be available. This description is usually provided by a wave scatter diagram. Wave scatter diagrams usually represent the long-term wave environment during a (typical) year and are based on several years of site-specific data to ensure that they adequately represent the wave environment at the location of the structure. In this thesis, the comparison between these fatigue approaches serves as a pilot study for planned reliability analyses of decommissioned offshore platforms, aimed at maximizing the reuse of these platforms for future wind generation systems.
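A deterministic fatigue check of the kind described typically collapses the wave scatter diagram into stress-range blocks and combines a one-slope S-N curve with the Palmgren-Miner damage sum. The sketch below uses illustrative curve parameters and stress blocks, not the thesis's site-specific data:

```python
import math

def sn_cycles_to_failure(stress_range, log_a=12.164, m=3.0):
    """Cycles to failure from a one-slope S-N curve,
    log10(N) = log_a - m * log10(S). The log_a and m values here are
    illustrative, in the style of offshore design-code curves."""
    return 10 ** (log_a - m * math.log10(stress_range))

def miner_damage(blocks):
    """Palmgren-Miner linear damage sum over (stress_range_MPa, n_cycles)
    blocks; failure is predicted when the sum reaches 1."""
    return sum(n / sn_cycles_to_failure(s) for s, n in blocks)

# One year of a hypothetical wave scatter diagram, collapsed into stress blocks
year = [(60.0, 2.0e5), (40.0, 1.0e6), (20.0, 5.0e6)]
damage = miner_damage(year)
print(round(damage, 4), "damage/year -> fatigue life ~", round(1 / damage, 1), "years")
```

The spectral approach replaces the explicit cycle counts with stress-range distributions derived from transfer functions and sea-state spectra, but it feeds the same Miner summation.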
Abstract:
Flaring has been widely used in upstream oil and gas operations, both onshore and offshore. It is considered a safe and reliable way to protect assets from overpressure and the environment from toxic gas by combustion. However, flares have drawbacks, such as vibration and thermal radiation. Excessive exposure to thermal radiation is harmful to offshore personnel and equipment, and research organizations and companies have invested time and money to combat this. Many technologies have been developed to reduce the risk of thermal radiation, one of them being the water curtain system. Several tests were carried out to assess the effectiveness of the water curtain system in mitigating thermal radiation in an offshore environment. Each test varied the flare output, the wind speed, and the size of the water droplets in the water curtain. The results of the tests were then compared and analyzed. They showed that a water curtain system can be a solution to excessive thermal radiation from an offshore flare. Moreover, a water curtain with smaller droplet diameters gives a more favorable result in reducing thermal radiation. These results suggest that, although it offers simplicity and efficiency, designing an efficient water curtain system requires careful study: conditions such as wind speed, flare intensity, and droplet size play a vital role in the system's effectiveness in attenuating thermal radiation.
Abstract:
Ultracold gases provide an ideal platform for quantum simulations of many-body systems. Here we are interested in a particular system that has been the focus of most experimental and theoretical work on ultracold fermionic gases: the unitary Fermi gas. In this work we use Quantum Monte Carlo simulations to study a two-component gas of fermionic atoms at zero temperature in the unitary regime. Specifically, we are interested in how the effective masses of the quasi-particles of the two components of the Fermi liquid evolve as the polarization is progressively reduced from full to lower values. A recent theoretical work, based on alternative diagrammatic methods, has suggested that such effective masses should diverge at a critical polarization. To independently verify this prediction, we perform Variational Monte Carlo (VMC) calculations of the energy based on Jastrow-Slater wavefunctions after adding or subtracting a particle with a given momentum to or from a full Fermi sphere. In this way, we determine the quasi-particle dispersions, from which we extract the effective masses for different polarizations. The resulting effective masses turn out to be quite close to the non-interacting values, although some evidence of an increase in the effective mass of the minority component appears close to the predicted critical polarization. Preliminary results obtained for the majority component with the fixed-node Diffusion Monte Carlo (DMC) method indicate that DMC could lead to an increase in the effective masses compared with the VMC results. Finally, we point out further improvements to the trial wave function and boundary conditions that would be necessary in future simulations to draw definite conclusions on the effective masses of the polarized unitary Fermi gas.
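Extracting an effective mass from a quasi-particle dispersion typically means fitting E(k) = E0 + k²/(2m*) (units ħ = 1) to the computed energies. The sketch below fits synthetic data, not the VMC/DMC results quoted above:

```python
def effective_mass(k_values, energies):
    """Least-squares fit of E(k) = E0 + k^2 / (2 m*) via the linearisation
    x = k^2: the slope of E against x is 1/(2 m*)."""
    xs = [k * k for k in k_values]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(energies) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, energies))
             / sum((x - mx) ** 2 for x in xs))
    return 1.0 / (2.0 * slope)

# Synthetic dispersion with m* = 1.25 (in bare-mass units) and E0 = -0.6
ks = [0.2, 0.4, 0.6, 0.8, 1.0]
es = [-0.6 + k * k / (2 * 1.25) for k in ks]
print(round(effective_mass(ks, es), 3))   # -> 1.25
```

With noisy Monte Carlo energies the fit would carry error bars, and a diverging m* would show up as the fitted slope approaching zero.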