944 results for Tool Design


Relevance:

30.00%

Publisher:

Abstract:

The growing need to patrol and survey large maritime and terrestrial areas has increased the need to integrate external sensors on aircraft in order to accomplish those patrols at increasingly higher altitudes and longer ranges, independently of vehicle type. The main focus of this work is to elaborate a practical, simple, effective and efficient methodology for the aircraft modification procedure resulting from the integration of an Electro-Optical/Infra-Red (EO/IR) turret through a support structure. The importance of developing a good methodology lies in the correct management of project variables such as time, available resources and project complexity. The key is to deliver a proper tool for a project design team, to be used to create a solution that fulfils all technical, non-technical and certification requirements present in this field of transportation. The created methodology is independent of its two main inputs, the sensor model and the aircraft model definition, and is therefore intended to deliver results for projects other than the one presented in this work as a case study. This particular case study presents the development of a support structure for the integration of a FLIR STAR SAPHIRE III turret on the front lower fuselage bulkhead (radome) of the LOCKHEED MARTIN C-130 H. The case study focuses on local structural analysis through the use of the Finite Element Method (FEM). This Dissertation resulted from a cooperation between the Faculty of Science and Technology - Universidade Nova de Lisboa and the company OGMA - Indústria Aeronáutica de Portugal.
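
The structural side of the case study relies on the Finite Element Method. As a minimal illustration of the FEM idea only (not the actual 3D turret-support model, which would be built in a dedicated FEM package), the sketch below assembles and solves a two-element 1D bar with arbitrary illustrative values for E, A, L and the applied load.

```python
# Minimal FEM illustration: a 1D bar made of two 2-node elements, fixed at one
# end and loaded at the other. E, A, L and the load are arbitrary values chosen
# only to show the assembly/solve steps; this is not the thesis's model.
import numpy as np

E, A, L = 70e9, 3e-4, 0.5                 # Young's modulus [Pa], area [m^2], element length [m]
k = (E * A / L) * np.array([[1.0, -1.0],
                            [-1.0, 1.0]])  # 2x2 element stiffness matrix

K = np.zeros((3, 3))                       # global stiffness for 3 nodes
for i, j in [(0, 1), (1, 2)]:              # element connectivity
    K[np.ix_([i, j], [i, j])] += k         # assemble both elements

F = np.array([0.0, 0.0, 5e3])              # 5 kN axial load at the free end
free = [1, 2]                              # node 0 is fixed (support)
u = np.zeros(3)
u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
print("nodal displacements [m]:", u)
```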

Relevance:

30.00%

Publisher:

Abstract:

This thesis proposes a methodology for modelling business interoperability in a context of cooperative industrial networks. The purpose is to develop a methodology that enables the design of cooperative industrial network platforms that are able to deliver business interoperability, and the analysis of its impact on the performance of these platforms. To achieve the proposed objective, two modelling tools have been employed: the Axiomatic Design Theory for the design of interoperable platforms, and Agent-Based Simulation for the analysis of the impact of business interoperability. The sequence in which the two modelling tools are applied depends on the scenario under analysis, i.e. whether the cooperative industrial network platform exists or not. If the cooperative industrial network platform does not exist, the methodology suggests first applying the Axiomatic Design Theory to design different configurations of interoperable cooperative industrial network platforms, and then using Agent-Based Simulation to analyse or predict the business interoperability and operational performance of the designed configurations. Otherwise, one should start by analysing the performance of the existing platform and, based on the achieved results, decide whether it is necessary to redesign it. If a redesign is needed, simulation is once again used to predict the performance of the redesigned platform. To explain how these two modelling tools can be applied in practice, a theoretical modelling framework, a theoretical Axiomatic Design model and a theoretical Agent-Based Simulation model are proposed. To demonstrate the applicability of the proposed methodology and/or to validate the proposed theoretical models, a case study of a Portuguese reverse logistics cooperative network (the Valorpneu network) and a case study of a Portuguese construction project (the Dam Baixo Sabor network) are presented. The findings of applying the proposed methodology to these two case studies suggest that the Axiomatic Design Theory can indeed contribute effectively to the design of interoperable cooperative industrial network platforms, and that Agent-Based Simulation provides an effective set of tools for analysing the impact of business interoperability on the performance of those platforms. However, these conclusions cannot be generalised, as only two case studies have been carried out. In terms of relevance to theory, this is the first time that the network effect is addressed in the analysis of the impact of business interoperability on the performance of networked companies, and also the first time that a holistic approach is proposed to design interoperable cooperative industrial network platforms. Regarding the practical implications, the proposed methodology is intended to provide industrial managers with a management tool that can guide them easily, and in a practical and systematic way, through the design of configurations of interoperable cooperative industrial network platforms and/or the analysis of the impact of business interoperability on the performance of their companies and the networks in which their companies operate.
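
As a rough illustration of how Agent-Based Simulation can link an interoperability level to operational performance, the toy sketch below simulates orders passed along a chain of partner companies, where each information exchange succeeds with a given probability and failures add rework time. The agents, parameters and performance measure are invented for illustration; this is not the thesis's Valorpneu or Baixo Sabor model.

```python
# Toy agent-based sketch (not the thesis's model): each order passes through a
# chain of partner companies; an information exchange succeeds with probability
# `interoperability`, and a failed exchange costs extra rework time. We compare
# the average order lead time for two interoperability levels.
import random

def simulate(n_orders, n_partners, interoperability, base_time=1.0, rework=2.0, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_orders):
        lead_time = 0.0
        for _ in range(n_partners):          # order handed from partner to partner
            lead_time += base_time
            if rng.random() > interoperability:
                lead_time += rework          # failed exchange -> manual rework
        total += lead_time
    return total / n_orders

for p in (0.6, 0.95):
    print(f"interoperability={p:.2f} -> mean lead time {simulate(1000, 5, p):.2f}")
```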

Relevance:

30.00%

Publisher:

Abstract:

In the present work, the benefits of using graphics processing units (GPU) to aid the design of complex-geometry profile extrusion dies are studied. For that purpose, a 3D finite volume based code that employs unstructured meshes to solve and couple the continuity, momentum and energy conservation equations governing the fluid flow, together with a constitutive equation, was used. To evaluate the possibility of reducing the time spent on the numerical calculations, the code was parallelized on the GPU, using a simple programming approach without complex memory manipulations. For verification purposes, simulations were performed for three benchmark problems: Poiseuille flow, lid-driven cavity flow and flow around a cylinder. Subsequently, the code was used in the design of two real-life extrusion dies for the production of a medical catheter and a wood-plastic composite decking profile. To evaluate the benefits, the results obtained with the GPU-parallelized code were compared, in terms of speedup, with a serial implementation of the same code that traditionally runs on the central processing unit (CPU). The results obtained show that, even with the simple parallelization approach employed, it was possible to obtain a significant reduction of the computation times.
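
The reported benefit is expressed as speedup against a serial CPU run. The sketch below illustrates that metric only, timing a simple explicit stencil update once as a plain Python loop ("serial") and once as a vectorized NumPy version standing in for a parallel implementation; it is not the authors' finite-volume solver or their GPU code.

```python
# Illustration of the speedup metric: speedup = t_serial / t_parallel, measured
# on a toy 2D averaging stencil. The vectorized version is only a stand-in for
# a GPU-parallel implementation.
import time
import numpy as np

n = 256
T = np.random.rand(n, n)

def step_loop(T):
    out = T.copy()
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            out[i, j] = 0.25 * (T[i-1, j] + T[i+1, j] + T[i, j-1] + T[i, j+1])
    return out

def step_vec(T):
    out = T.copy()
    out[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:])
    return out

t0 = time.perf_counter(); step_loop(T); t_serial = time.perf_counter() - t0
t0 = time.perf_counter(); step_vec(T);  t_fast   = time.perf_counter() - t0
print(f"speedup = {t_serial / t_fast:.1f}x")
```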

Relevance:

30.00%

Publisher:

Abstract:

PhD thesis in Bioengineering

Relevance:

30.00%

Publisher:

Abstract:

Master's dissertation in Design and Marketing

Relevance:

30.00%

Publisher:

Abstract:

This report summarizes a final-year degree project in Computer Engineering (Enginyeria Superior d'Informàtica). It explains the main reasons that motivated the project, as well as examples that illustrate the resulting application. In this case, the software aims to address the current need for Ground Truth data for text segmentation algorithms working on complex colour images. All the processes are explained in the different chapters, starting from the definition of the problem, the planning, the requirements and the design, through to the illustration of the program's results and the resulting Ground Truth data.

Relevance:

30.00%

Publisher:

Abstract:

SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences based on steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on chromatin immunoprecipitation followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unrecognized artifacts of the method. The sequence tag distribution in the genome does not follow a uniform distribution, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I (NFI) DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with the DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
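
The two analysis ideas described above, filtering artifactual tag hot-spots and unbiased random subsampling of sequence tags, can be sketched as follows. The data, bin size and hot-spot threshold are made up for illustration and are not the thesis's actual filtering criteria.

```python
# Sketch with made-up data: (1) drop genomic bins whose tag count is an extreme
# outlier (candidate artifactual hot-spot); (2) draw an unbiased random
# subsample of the remaining tags for downstream comparisons.
import random
from collections import Counter

random.seed(1)
# tags as (chromosome, position); one locus is artificially over-represented
tags = [("chr1", random.randint(0, 1_000_000)) for _ in range(10_000)]
tags += [("chr1", 500_000)] * 2_000            # simulated artifactual hot-spot

bin_size = 1_000
counts = Counter((chrom, pos // bin_size) for chrom, pos in tags)
mean = sum(counts.values()) / len(counts)
threshold = 10 * mean                          # arbitrary hot-spot cutoff
hotspots = {b for b, c in counts.items() if c > threshold}

filtered = [t for t in tags if (t[0], t[1] // bin_size) not in hotspots]
subsample = random.sample(filtered, 5_000)     # unbiased random subsampling
print(len(tags), "tags ->", len(filtered), "after filtering;", len(subsample), "sampled")
```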

Relevance:

30.00%

Publisher:

Abstract:

PADICAT is the web archive created in 2005 in Catalonia (Spain) by the Library of Catalonia (BC), the national library of Catalonia, with the aim of collecting, processing and providing permanent access to the digital heritage of Catalonia. Its harvesting strategy is based on a hybrid model (massive harvesting of the SPA top-level domain; selective compilation of the web site output of Catalan organizations; focused harvesting of public events). The system provides open access to the whole collection on the Internet. We consider it necessary to complement the current search and visualization software with a new open source software tool, CAT (Curator Archiving Tool), composed of three modules aimed at effectively managing the process of human cataloguing; publishing directories of the digital resources and special collections; and offering added-value statistical information to end users. Within the framework of the International Internet Preservation Consortium meeting (Vienna, 2010), the progress in the development of this new tool, and the philosophy that has motivated its design, are presented to the international community.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: The aim of this pilot study was to describe problems in functioning and the associated rehabilitation needs in persons with spinal cord injury after the 2010 earthquake in Haiti, by applying a newly developed tool based on the International Classification of Functioning, Disability and Health (ICF). DESIGN: Pilot study. SUBJECTS: Eighteen persons with spinal cord injury (11 women, 7 men) participated in the needs assessment. Eleven patients had complete lesions (American Spinal Injury Association Impairment Scale; AIS A) and one patient had tetraplegia. METHODS: Data collection included information from the International Spinal Cord Injury Core Data Set and a newly developed needs assessment tool based on ICF Core Sets. This tool assesses the level of functioning, the corresponding rehabilitation need, and the health professional required. Data were summarized using descriptive statistics. RESULTS: In body functions and body structures, patients showed typical problems following spinal cord injury. Nearly all patients showed limitations and restrictions in activities and participation related to mobility, self-care and aspects of social integration. Several environmental factors acted as barriers to these activities; however, the availability of products and social support were identified as facilitators. Rehabilitation needs were identified in nearly all aspects of functioning. To address these needs, a multidisciplinary approach would be required. CONCLUSION: This ICF-based needs assessment provided useful information for rehabilitation planning in the context of a natural disaster. Future studies are required to test and, if necessary, adapt the assessment.

Relevance:

30.00%

Publisher:

Abstract:

Among the molecular markers commonly used for mosquito taxonomy, the internal transcribed spacer 2 (ITS2) of the ribosomal DNA is useful for distinguishing among closely related species. Here we review 178 GenBank accession numbers matching ITS2 sequences of Latin American anophelines. Among those, we found 105 unique sequences corresponding to 35 species. Overall, ITS2 sequences distinguish anopheline species; however, information on intraspecific and geographic variation is scarce. Intraspecific variation ranged from 0.2% to 19%, and our analysis indicates that misidentification and/or sequencing errors could be responsible for some of the high divergence values. Research in Latin American malaria vector taxonomy has profited from molecular data provided by single or few field-captured mosquitoes. However, we propose that caution should be taken and minimum requirements considered in the design of additional studies. Future studies in this field should consider that: (1) voucher specimens, assigned to the DNA sequences, need to be deposited in collections; (2) intraspecific variation should be thoroughly evaluated; (3) ITS2 and other molecular markers, considered as a group, will provide more reliable information; (4) biological data about vector populations are missing and should be prioritized; (5) molecular markers are most powerful when coupled with traditional taxonomic tools.
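
The intraspecific values quoted above (0.2% to 19%) are pairwise sequence divergences. A minimal sketch of the underlying calculation, an uncorrected p-distance between two aligned ITS2 fragments, is shown below; the sequences are short invented examples.

```python
# Uncorrected pairwise p-distance (percent divergence) between two aligned
# sequences, the kind of quantity behind the intraspecific values cited above.
def p_distance(seq1, seq2):
    # compare only positions where neither sequence has an alignment gap
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
    diffs = sum(1 for a, b in pairs if a != b)
    return 100.0 * diffs / len(pairs)

s1 = "ACGTGCTAGCTAGGCTA-CGT"   # invented example sequences
s2 = "ACGTGCTTGCTAGGCTAACGT"
print(f"p-distance = {p_distance(s1, s2):.1f}%")
```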

Relevance:

30.00%

Publisher:

Abstract:

Background. During the last few years, PCR-based methods have been developed to simplify and reduce the time required for genotyping Mycobacterium tuberculosis (MTB) compared with standard approaches based on IS6110 Restriction Fragment Length Polymorphism (RFLP). Of these, MIRU-VNTR typing based on 12 loci (mycobacterial interspersed repetitive units, variable number of tandem repeats; MIRU-12) has been considered a good alternative. Nevertheless, some limitations and discrepancies with RFLP, which are minimized if the technique is complemented with spoligotyping, have been found. Recently, a new version of MIRU-VNTR targeting 15 loci (MIRU-15) has been proposed to improve on the MIRU-12 format. Results. We evaluated the new MIRU-15 tool in two different samples. First, we analyzed the same convenience sample that had been used to evaluate MIRU-12 in a previous study, and the new 15-locus version offered higher discriminatory power (Hunter-Gaston discriminatory index [HGDI]: 0.995 vs 0.978; 34.4% of clustered cases vs 57.5%) and better correlation (full or high correlation with RFLP for 82% of the clusters vs 47%). Second, we evaluated MIRU-15 on a population-based sample and, once again, good correlation with the RFLP clustering data was observed (for 83% of the RFLP clusters). To understand the meaning of the discrepancies still found between MIRU-15 and RFLP, we analyzed the epidemiological data for the clustered patients. In most cases, splitting of RFLP-clustered patients by MIRU-15 occurred for those without epidemiological links, and RFLP-clustered patients with epidemiological links were also clustered by MIRU-15, suggesting a good epidemiological background for the clustering defined by MIRU-15. Conclusion. The data obtained by MIRU-15 suggest that the new design is very efficient at assigning clusters confirmed by epidemiological data. Adding this to the speed with which it provides results, MIRU-15 can be considered a suitable tool for real-time genotyping.
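
The discriminatory power figures (HGDI 0.995 vs 0.978) use the standard Hunter-Gaston discriminatory index, HGDI = 1 - [1 / (N(N-1))] * sum over clusters of n_j(n_j - 1). A small sketch with hypothetical cluster sizes (not the study's data):

```python
# Hunter-Gaston discriminatory index (HGDI) computed from the sizes of the
# clusters a typing method resolves a sample into.
def hgdi(cluster_sizes):
    n = sum(cluster_sizes)                       # total number of isolates
    return 1.0 - sum(c * (c - 1) for c in cluster_sizes) / (n * (n - 1))

# e.g. 30 isolates resolved into clusters of 5, 3, 2, 2 plus 18 unique patterns
print(f"HGDI = {hgdi([5, 3, 2, 2] + [1] * 18):.3f}")
```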

Relevance:

30.00%

Publisher:

Abstract:

Metabolic problems lead to numerous failures during clinical trials, and much effort is now devoted to developing in silico models that predict metabolic stability and metabolites. Such models are well known for cytochromes P450 and some transferases, whereas less has been done to predict the activity of human hydrolases. The present study was undertaken to develop a computational approach able to predict the hydrolysis of novel esters by human carboxylesterase hCES2. The study first involved homology modelling of the hCES2 protein based on the model of hCES1, since the two proteins share a high degree of homology (approximately 73%). A set of 40 known substrates of hCES2 was taken from the literature; the ligands were docked in both their neutral and ionized forms using GriDock, a parallel tool based on the AutoDock 4.0 engine which can perform efficient and easy virtual screening analyses of large molecular databases exploiting multi-core architectures. Useful statistical models (e.g., r² = 0.91 for substrates in their unprotonated state) were calculated by correlating experimental pKm values with the distance between the carbon atom of the substrate's ester group and the hydroxy function of Ser228. Additional parameters in the equations accounted for hydrophobic and electrostatic interactions between substrates and contributing residues. The negatively charged residues in the hCES2 cavity explained the preference of the enzyme for neutral substrates and, more generally, suggested that ligands which interact too strongly through ionic bonds (e.g., ACE inhibitors) cannot be good CES2 substrates because they are trapped in the cavity in unproductive modes and behave as inhibitors. The effects of protonation on substrate recognition and the contrasting behavior of substrates and products were finally investigated by MD simulations of some CES2 complexes.
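
The statistical models described above are one-descriptor linear correlations between experimental pKm and the ester-carbon to Ser228-hydroxyl distance. The sketch below fits such a model and reports r²; the (distance, pKm) pairs are invented for illustration and do not reproduce the study's r² = 0.91.

```python
# One-descriptor linear model: correlate pKm with a docking-derived distance
# and report r^2. The data points below are invented.
import numpy as np

distance = np.array([3.1, 3.4, 3.8, 4.2, 4.9, 5.5])   # angstrom (hypothetical)
pkm      = np.array([4.8, 4.6, 4.3, 4.1, 3.6, 3.2])   # hypothetical

slope, intercept = np.polyfit(distance, pkm, 1)
pred = slope * distance + intercept
ss_res = np.sum((pkm - pred) ** 2)
ss_tot = np.sum((pkm - pkm.mean()) ** 2)
print(f"pKm = {slope:.2f} * d + {intercept:.2f},  r^2 = {1 - ss_res / ss_tot:.2f}")
```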

Relevance:

30.00%

Publisher:

Abstract:

A Web-based tool developed to automatically correct relational database schemas is presented. This tool has been integrated into a more general e-learning platform and is used to reinforce teaching and learning in database courses. The platform assigns each student a set of database problems selected from a common repository. The student has to design a relational database schema and enter it into the system through a user-friendly interface specifically designed for this purpose. The correction tool checks the design and shows the detected errors. The student then has the chance to correct them and submit a new solution, and these steps can be repeated as many times as required until a correct solution is obtained. Currently, this system is being used in several introductory database courses at the University of Girona with very promising results.
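
As an illustration of the kind of check an automatic correction tool can perform (a sketch only, not the Girona platform's actual implementation), the snippet below compares a submitted relational schema against a reference solution and reports missing tables, missing columns and wrong primary keys.

```python
# Sketch of an automatic schema check: the reference and submitted schemas
# below are invented, and the comparison rules are deliberately simple.
reference = {
    "student": {"columns": {"id", "name", "email"}, "pk": {"id"}},
    "course":  {"columns": {"code", "title"},       "pk": {"code"}},
}
submitted = {
    "student": {"columns": {"id", "name"},          "pk": set()},
    "course":  {"columns": {"code", "title"},       "pk": {"code"}},
}

def correct(reference, submitted):
    errors = []
    for table, spec in reference.items():
        if table not in submitted:
            errors.append(f"missing table '{table}'")
            continue
        got = submitted[table]
        for col in spec["columns"] - got["columns"]:
            errors.append(f"table '{table}': missing column '{col}'")
        if spec["pk"] != got["pk"]:
            errors.append(f"table '{table}': primary key should be {sorted(spec['pk'])}")
    return errors

for e in correct(reference, submitted):
    print(e)
```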

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Several questionnaires have been used to measure health-related quality of life (HRQoL) in patients with psoriasis; few have been adapted for use in Spain, and none was developed specifically for the Spanish population. The purpose of the study was to validate and assess the sensitivity to change of a new questionnaire to measure HRQoL in patients with psoriasis (PSO-LIFE). METHODS: Observational, prospective, multicenter study performed in centers around Spain. Patients with active or inactive psoriasis completed the PSO-LIFE together with the Dermatology Life Quality Index (DLQI) and the Psoriasis Disability Index (PDI). A control group of patients with urticaria or atopic dermatitis was also included. Internal consistency and test-retest reliability of the PSO-LIFE were assessed by calculating Cronbach's alpha and the intraclass correlation coefficient (ICC). Validity was assessed by examining the factorial structure, the capacity to discriminate between groups, and correlations with other measures. Sensitivity to change was measured using effect sizes. RESULTS: The final sample included for analysis consisted of 304 patients and 56 controls. Mean (SD) age of psoriasis patients was 45.3 (14.5) years compared to 38.8 (14) years for controls (p < 0.01). Cronbach's alpha for the PSO-LIFE was 0.95 and test-retest reliability using the ICC was 0.98. Factor analysis showed the questionnaire to be unidimensional. Mean (SD) PSO-LIFE scores differed between patients with psoriasis and controls (64.9 [22.5] vs 69.4 [17.3]; p < 0.05), between those with active and inactive disease (57.4 [20.4] vs 76.4 [20.6]; p < 0.01), and between those with visible and non-visible lesions (63.0 [21.9] vs 74.8 [23.9]; p < 0.01). The correlation between PSO-LIFE and PASI scores was moderate (r = -0.43), while correlations with the DLQI and PDI dimensions ranged from moderate to high (between 0.4 and 0.8). The effect size on the PSO-LIFE in patients reporting 'much improved' health status at study completion was 1.01, a large effect. CONCLUSIONS: The present results provide substantial support for the reliability, validity, and responsiveness of the PSO-LIFE questionnaire in the population for which it was designed.
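
The internal-consistency figure (Cronbach's alpha = 0.95) follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with an invented response matrix (rows are respondents, columns are questionnaire items):

```python
# Cronbach's alpha for a small, invented item-response matrix.
import numpy as np

def cronbach_alpha(X):
    X = np.asarray(X, dtype=float)
    k = X.shape[1]                               # number of items
    item_var = X.var(axis=0, ddof=1).sum()       # sum of item variances
    total_var = X.sum(axis=1).var(ddof=1)        # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

responses = [[4, 5, 4, 4],
             [2, 2, 3, 2],
             [5, 4, 5, 5],
             [3, 3, 3, 4],
             [1, 2, 1, 2]]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```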

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces Collage, a high-level IMS-LD-compliant authoring tool that is specialized for CSCL (Computer-Supported Collaborative Learning). CSCL is nowadays a key trend in e-learning since it highlights the importance of social interaction as an essential element of learning. CSCL is an interdisciplinary domain, which demands participatory design techniques that allow teachers to get directly involved in design activities. Developing CSCL designs using LD is a difficult task for teachers, since LD is a complex technical specification and modelling collaborative characteristics can be tricky. Collage helps teachers in the process of creating their own potentially effective collaborative Learning Designs by reusing and customizing patterns according to the requirements of a particular learning situation. These patterns, called Collaborative Learning Flow Patterns (CLFPs), represent best practices that are repeatedly used by practitioners when structuring the flow of (collaborative) learning activities. An example of an LD that can be created using Collage is illustrated in the paper. Preliminary evaluation results show that teachers with experience in collaborative learning but without LD knowledge can successfully design real collaborative learning experiences using Collage.