Abstract:
Selection of reference genes is an essential consideration for increasing the precision and quality of relative expression analysis by the quantitative RT-PCR method. The stability of eight expressed sequence tags was evaluated to define potential reference genes for studying the differential expression of common bean target genes under biotic (incompatible interaction between common bean and the fungus Colletotrichum lindemuthianum) and abiotic (drought; salinity; cold temperature) stresses. The efficiency of the amplification curves and the quantification cycle (Cq) were determined using LinRegPCR software. The stability of the candidate reference genes was obtained using geNorm and NormFinder software, whereas the normalization of the differential expression of target genes [beta-1,3-glucanase 1 (BG1) gene for biotic stress and dehydration responsive element binding (DREB) gene for abiotic stress] was defined by REST software. High stability was obtained for the insulin degrading enzyme (IDE), actin-11 (Act11), unknown 1 (Ukn1) and unknown 2 (Ukn2) genes during biotic stress, and for the SKP1/ASK-interacting protein 16 (Skip16), Act11, tubulin beta-8 (beta-Tub8) and Ukn1 genes under abiotic stresses. However, IDE and Act11 were indicated as the best combination of reference genes for biotic stress analysis, whereas the Skip16 and Act11 genes were the best combination for studying abiotic stress. These genes should be useful in the normalization of gene expression by RT-PCR analysis in common bean, the most important edible legume.
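The kind of efficiency-corrected expression ratio that REST reports (Pfaffl's method) can be sketched in a few lines of Python; the efficiencies and Cq values below are illustrative assumptions, not data from the study.

    # Efficiency-corrected relative expression (Pfaffl-style ratio):
    # ratio = E_target^(dCq_target) / E_ref^(dCq_ref), dCq = Cq(control) - Cq(treated).
    def relative_expression(e_target, cq_tgt_ctrl, cq_tgt_treat,
                            e_ref, cq_ref_ctrl, cq_ref_treat):
        return (e_target ** (cq_tgt_ctrl - cq_tgt_treat)
                / e_ref ** (cq_ref_ctrl - cq_ref_treat))

    # Hypothetical example: target BG1 normalized against reference Act11,
    # with amplification efficiencies estimated by LinRegPCR (E close to 2).
    ratio = relative_expression(1.95, 28.4, 24.1, 1.98, 21.0, 21.2)
    print(round(ratio, 2))  # a ratio > 1 indicates up-regulation under stress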
Abstract:
Abstract Background From shotgun libraries used for the genomic sequencing of the phytopathogenic bacterium Xanthomonas axonopodis pv. citri (XAC), clones representative of the largest possible number of coding sequences (CDSs) were selected to create a DNA microarray platform on glass slides (XACarray). The creation of the XACarray established a tool capable of providing data for the analysis of global genome expression in this organism. Findings The inserts from the selected clones were amplified by PCR with the universal oligonucleotide primers M13R and M13F. The obtained products were purified and fixed in duplicate on glass slides specific for use in DNA microarrays. The number of spots on the microarray totaled 6,144 and included 768 positive controls and 624 negative controls per slide. Validation of the platform was performed through hybridization of total DNA probes from XAC labeled with different fluorophores, Cy3 and Cy5. In this validation assay, 86% of all PCR products fixed on the glass slides were confirmed to present a hybridization signal greater than twice the standard deviation of the global median signal-to-noise ratio. Conclusions Our validation of the XACarray platform using DNA-DNA hybridization revealed that it can be used to evaluate the expression of 2,365 individual CDSs from all major functional categories, which corresponds to 52.7% of the annotated CDSs of the XAC genome. As a proof of concept, we used this platform in a previous work to verify the absence of genomic regions that could not be detected by sequencing in related strains of Xanthomonas.
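One plausible reading of that signal-to-noise criterion can be sketched in Python; the per-spot SNR values below are invented for illustration, and the original analysis pipeline may differ.

    # Flag spots whose signal-to-noise ratio exceeds the global median SNR
    # by more than twice the standard deviation (one plausible reading of
    # the validation criterion described above).
    import statistics

    def validated_spots(snr):
        median_snr = statistics.median(snr)
        sd = statistics.stdev(snr)
        threshold = median_snr + 2 * sd
        return [i for i, s in enumerate(snr) if s > threshold]

    snr_values = [5.1, 6.0, 5.5, 14.2, 5.8, 6.3]  # illustrative per-spot SNRs
    print(validated_spots(snr_values))            # -> [3], the strong spot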
Abstract:
OBJECTIVE: The Prodromal Questionnaire (PQ) is a 92-item self-report screening tool for individuals at ultra-high risk (UHR) of developing psychosis. This study aims to present the translation to Portuguese and preliminary results in UHR and first episode (FE) psychosis in a Portuguese-speaking sample. METHODS: The PQ was translated from English to Portuguese by two bilingual researchers from the research program on early psychosis of the Instituto de Psiquiatria HCFMUSP, São Paulo, Brazil (ASAS - "Evaluation and Follow up of Adolescents and Young Adults in São Paulo") and back-translated by two other researchers. The study participants (n = 11) were evaluated with the Portuguese version of the Prodromal Questionnaire (PQ) and the SIPS. RESULTS: The individuals at UHR (n = 7) presented a lower score than the first episode patients (n = 4). On the positive symptoms subscale of the Portuguese version of the PQ, the UHR group scored 13.0 ± 10.0 points (mean ± standard deviation), whereas the FE patients scored 33.0 ± 10.0. CONCLUSION: The UHR and FE patients of this study presented PQ scores similar to those found in the literature, which suggests that it is possible to use the PQ in Brazilian help-seeking individuals as a screening tool.
Abstract:
Computational fluid dynamics (CFD) is becoming an essential tool in the prediction of the hydrodynamic efforts and flow characteristics of underwater vehicles for manoeuvring studies. However, when applied to the manoeuvrability of autonomous underwater vehicles (AUVs), most studies have focused on the determination of static coefficients without considering the effects of the vehicle control surface deflection. This paper analyses the hydrodynamic efforts generated on an AUV considering the combined effects of the control surface deflection and the angle of attack, using CFD software based on the Reynolds-averaged Navier–Stokes formulations. The CFD simulations are also conducted independently for the AUV bare hull and control surface, to better identify their individual and interference efforts and to validate the simulations against experimental results obtained in a towing tank. Several simulations of the bare hull case were conducted to select the k–ω SST turbulence model with the viscosity approach that best predicts its hydrodynamic efforts. Mesh sensitivity analyses were conducted for all simulations. For the flow around the control surfaces, the CFD results were analysed according to two different methodologies, standard and nonlinear. The nonlinear regression methodology provides better results than the standard methodology for predicting the stall at the control surface. The flow simulations have shown that the occurrence of the control surface stall depends on a linear relationship between the angle of attack and the control surface deflection. This type of information can be used in designing the vehicle's autopilot system.
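To illustrate the nonlinear-regression idea, a force coefficient can be fitted with a model that admits a post-peak drop; the cubic model and all data below are assumptions for illustration, not the paper's actual formulation.

    # Fit a nonlinear (cubic) model C(x) = a*x + b*x**3 to force-coefficient
    # data: unlike a straight line, it can bend over and capture stall onset.
    import numpy as np
    from scipy.optimize import curve_fit

    def cubic_model(x, a, b):
        return a * x + b * x**3

    incidence = np.radians([0, 4, 8, 12, 16, 20])          # assumed test angles
    coeff = np.array([0.0, 0.21, 0.40, 0.52, 0.55, 0.50])  # illustrative data
    (a, b), _ = curve_fit(cubic_model, incidence, coeff)
    print(a, b)  # a fitted b < 0 signals the loss of effectiveness near stall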
Abstract:
Degree in Marine Sciences. Faculty of Marine Sciences, University of Las Palmas de Gran Canaria. Institut de Ciències del Mar, Consejo Superior de Investigaciones Científicas
Abstract:
Background. The surgical treatment of dysfunctional hips is a severe condition for the patient and a costly therapy for public health. Hip resurfacing techniques seem to hold the promise of various advantages over traditional THR, with particular attention to young and active patients. Although the lesson provided in the past by many branches of engineering is that success in designing competitive products can be achieved only by predicting the possible scenarios of failure, to date the understanding of implant quality is poorly addressed pre-clinically. Thus revision is the only, albeit delayed, reliable end point for assessment. The aim of the present work was to model the musculoskeletal system so as to develop a protocol for predicting failure of hip resurfacing prostheses. Methods. Preliminary studies validated the technique for the generation of subject-specific finite element (FE) models of long bones from Computed Tomography data. The proposed protocol consisted in the numerical analysis of the prosthesis biomechanics by deterministic and statistical studies, so as to assess the risk of biomechanical failure under the different operative conditions the implant might face in a population of interest during various activities of daily living. Physiological conditions were defined including the variability of the anatomy, bone densitometry, surgery uncertainties and published boundary conditions at the hip. The protocol was tested by analysing a successful design on the market and a new prototype of a resurfacing prosthesis. Results. The intrinsic accuracy of the models on bone stress predictions (RMSE < 10%) was aligned with the current state of the art in this field. The accuracy of prediction of the bone-prosthesis contact mechanics was also excellent (< 0.001 mm). The sensitivity of the models' predictions to uncertainties on modelling parameters was found to be below 8.4%. The analysis of the successful design resulted in a very good agreement with published retrospective studies. The geometry optimisation of the new prototype led to a final design with a low risk of failure. The statistical analysis confirmed the minimal risk of the optimised design over the entire population of interest. The performance of the optimised design showed a significant improvement with respect to the first prototype (+35%). Limitations. In the authors' opinion, the major limitation of this study is in the boundary conditions. The muscular forces and the hip joint reaction were derived from the few data available in the literature, which can be considered significant but hardly representative of the entire variability of boundary conditions the implant might face over the patient population. This moved the focus of the research onto modelling the musculoskeletal system; the ongoing activity is to develop subject-specific musculoskeletal models of the lower limb from medical images. Conclusions. The developed protocol was able to accurately predict known clinical outcomes when applied to a well-established device and to support the design optimisation phase, providing important information on critical characteristics of the patients, when applied to a new prosthesis. The presented approach has a relevant generality that would allow the extension of the protocol to a large set of orthopaedic scenarios with minor changes. Hence, a failure mode analysis criterion can be considered a suitable tool in developing new orthopaedic devices.
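The statistical part of such a protocol can be pictured as a Monte Carlo sweep over population variability; the distributions and the toy failure criterion below are illustrative assumptions standing in for the actual finite element analyses.

    # Conceptual Monte Carlo risk assessment: sample patient/surgery parameters,
    # evaluate a (here, toy) failure criterion, and estimate the failure risk
    # over the population of interest.
    import random

    def failure(bone_density, implant_angle_deg, peak_load_n):
        # Toy surrogate for the FE analysis: weaker bone, off-nominal implant
        # placement and higher load all push a notional risk index past 1.0.
        risk_index = ((peak_load_n / 3000.0) * (0.8 / bone_density)
                      * (1.0 + abs(implant_angle_deg - 140.0) / 40.0))
        return risk_index > 1.0

    random.seed(1)
    trials = 10_000
    failures = sum(
        failure(bone_density=random.gauss(0.8, 0.15),       # g/cm^3, assumed
                implant_angle_deg=random.gauss(140.0, 5.0),  # stem-shaft angle, assumed
                peak_load_n=random.gauss(2500.0, 400.0))     # daily-living load, assumed
        for _ in range(trials))
    print(f"estimated failure risk: {failures / trials:.1%}")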
Abstract:
Process algebraic architectural description languages provide a formal means for modeling software systems and assessing their properties. In order to bridge the gap between system modeling and system implementation, in this thesis an approach is proposed for automatically generating multithreaded object-oriented code from process algebraic architectural descriptions, in a way that preserves, under certain assumptions, the properties proved at the architectural level. The approach is divided into three phases, which are illustrated by means of a running example based on an audio processing system. First, we develop an architecture-driven technique for thread coordination management, which is completely automated through a suitable package. Second, we address the translation of the algebraically-specified behavior of the individual software units into thread templates, which will have to be filled in by the software developer according to certain guidelines. Third, we discuss performance issues related to the suitability of synthesizing monitors rather than threads from software unit descriptions that satisfy specific constraints. In addition to the running example, we present two case studies, about a video animation repainting system and the implementation of a leader election algorithm, in order to summarize the whole approach. The outcome of this thesis is the implementation of the proposed approach in a translator called PADL2Java and its integration into the architecture-centric verification tool TwoTowers.
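The flavor of a generated thread template can be sketched as follows; PADL2Java emits Java, so this Python fragment and all its names are purely illustrative assumptions.

    # Hedged sketch of a thread template: process-algebraic input and output
    # actions become queue operations, and the developer fills in the
    # processing hook following the generation guidelines.
    import queue
    import threading

    class AudioFilterThread(threading.Thread):
        def __init__(self, inbox, outbox):
            super().__init__()
            self.inbox, self.outbox = inbox, outbox

        def run(self):
            while True:
                sample = self.inbox.get()       # input action of the software unit
                if sample is None:              # termination action
                    break
                self.outbox.put(self.process(sample))  # output action

        def process(self, sample):
            # Body left to the developer, as in the generated templates.
            return sample

    inbox, outbox = queue.Queue(), queue.Queue()
    t = AudioFilterThread(inbox, outbox)
    t.start()
    inbox.put(0.5); inbox.put(None)
    t.join()
    print(outbox.get())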
Abstract:
[ES] An educational tool for training biomedical engineers in the field of biomedical signal processing has been developed. It is software for simulating and studying the results obtained when different signal processing techniques are applied to biomedical signals. The tool has been implemented with a graphical user interface to facilitate its use.
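A minimal example of the kind of exercise such a tool can demonstrate, assuming a Python environment with NumPy and SciPy (the signal, sampling rate and filter band are illustrative):

    # Apply a band-pass filter to a synthetic noisy signal and compare the
    # raw and filtered traces, as a teaching tool might display them.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 500.0                                    # sampling rate [Hz], assumed
    t = np.arange(0, 2, 1 / fs)
    clean = np.sin(2 * np.pi * 10 * t)            # 10 Hz component of interest
    noisy = clean + 0.5 * np.random.randn(t.size)

    b, a = butter(4, [5, 15], btype="bandpass", fs=fs)  # 5-15 Hz pass band
    filtered = filtfilt(b, a, noisy)
    print(np.corrcoef(clean, filtered)[0, 1])     # how well the component is recovered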
Abstract:
[EN] One of the main issues of the current education system is the lack of student motivation. This aspect, together with the permanent change that Information and Communications Technologies involve, represents a major challenge for the teacher: to continuously update contents and to keep the student's interest awake. A tremendously useful tool in classrooms is the integration of projects with participative and collaborative dynamics, where the teacher acts mainly as a guide for the student activity instead of being a mere transmitter of knowledge and evaluation. As a specific example of project-based learning, the EDUROVs project consists in building an inexpensive underwater robot using low-cost materials, while allowing the integration and programming of many accessories and sensors on a minimal budget using open-source hardware and software.
Abstract:
Magnetic resonance imaging (MRI) is today precluded to patients bearing active implantable medical devices (AIMDs). The great advantages related to this diagnostic modality, together with the increasing number of people benefiting from implantable devices, in particular pacemakers (PM) and cardioverter/defibrillators (ICD), are prompting the scientific community to study the possibility of extending MRI also to implanted patients. The MRI-induced specific absorption rate (SAR) and the consequent heating of biological tissues are among the major concerns that make patients bearing metallic structures contraindicated for MRI scans. To date, both in-vivo and in-vitro studies have demonstrated the potentially dangerous temperature increase caused by the radiofrequency (RF) field generated during MRI procedures in the tissues surrounding thin metallic implants. On the other side, the technical evolution of MRI scanners and of AIMDs, together with published data on the lack of adverse events, have reopened the interest in this field and suggest that, under given conditions, MRI can be safely performed also in implanted patients. With a better understanding of the hazards of performing MRI scans on implanted patients, as well as the development of MRI-safe devices, we may soon enter an era where this imaging modality may be more widely used to assist in the appropriate diagnosis of patients with devices. In this study both experimental measurements and numerical analyses were performed. The aim of the study is to systematically investigate the effects of the MRI RF field on implantable devices and to identify the elements that play a major role in the induced heating. Furthermore, we aimed at developing a realistic numerical model able to simulate the interactions between an RF coil for MRI and biological tissues implanted with a PM, and to predict the induced SAR as a function of the particular path of the PM lead. The methods developed and validated during the PhD program led to the design of an experimental framework for the accurate measurement of PM lead heating induced by MRI systems. In addition, numerical models based on Finite-Difference Time-Domain (FDTD) simulations were validated to obtain a general tool for investigating the large number of parameters and factors involved in this complex phenomenon. The results obtained demonstrated that MRI-induced heating on metallic implants is a real risk that represents a contraindication to extending MRI scans also to patients bearing a PM, an ICD, or other thin metallic objects. On the other side, both experimental data and numerical results show that, under particular conditions, MRI procedures might be considered reasonably safe also for an implanted patient. The complexity and the large number of variables involved make it difficult to define a unique set of such conditions: when the benefits of an MRI investigation cannot be obtained using other imaging techniques, the possibility to perform the scan should not be immediately excluded, but some considerations are always needed.
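The local SAR that such models evaluate follows from the induced electric field through a standard relation; the tissue values below are assumptions for illustration, not measurements from this study.

    # Local SAR from the RF electric field: SAR = sigma * E_rms^2 / rho,
    # with sigma the tissue conductivity and rho the tissue mass density.
    def local_sar(sigma_s_per_m, e_rms_v_per_m, rho_kg_per_m3):
        return sigma_s_per_m * e_rms_v_per_m**2 / rho_kg_per_m3

    # Illustrative muscle-like tissue near a lead tip at 64 MHz (1.5 T):
    print(local_sar(0.7, 300.0, 1050.0), "W/kg")  # strong local enhancement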
Abstract:
MoNET is an emulator for mobile wireless networks, composed of a suite of distributed software components. MoNET provides researchers and developers with a controlled virtualized environment for developing and testing mobile applications and network protocols for any type of hardware and software platform that can be virtualized. The distributed nature of this emulator makes it possible to create scenarios of arbitrary size. The wireless network is emulated transparently, so the connectivity perceived by each virtual node has the same characteristics as the emulated physical one.
Abstract:
This work describes the development of a simulation tool which allows the simulation of the Internal Combustion Engine (ICE), the transmission and the vehicle dynamics. It is a control-oriented simulation tool, designed to perform both off-line (Software In the Loop) and on-line (Hardware In the Loop) simulation. In the first case the simulation tool can be used to optimize Engine Control Unit strategies (for example, regarding the fuel consumption or the performance of the engine), while in the second case it can be used to test the control system. In recent years the use of HIL simulations has proved to be very useful in the development and testing of control systems. Hardware In the Loop simulation is a technology where the actual vehicles, engines or other components are replaced by a real-time simulation, based on a mathematical model and running in a real-time processor. The processor reads the ECU (Engine Control Unit) output signals which would normally feed the actuators and, by using mathematical models, provides the signals which would be produced by the actual sensors. The simulation tool, fully designed within Simulink, includes the possibility to simulate the engine alone, the transmission and vehicle dynamics alone, or the engine together with the vehicle and transmission dynamics, allowing in this last case the evaluation of the performance and operating conditions of the Internal Combustion Engine once it is installed on a given vehicle. Furthermore, the simulation tool includes different levels of complexity, since it is possible to use, for example, either a zero-dimensional or a one-dimensional model of the intake system (the latter only for off-line applications, because of the higher computational effort). Given these preliminary remarks, an important goal of this work is the development of a simulation environment that can be easily adapted to different engine types (single- or multi-cylinder, four-stroke or two-stroke, diesel or gasoline) and transmission architectures without reprogramming. Also, the same simulation tool can be rapidly configured both for off-line and real-time applications. The Matlab-Simulink environment has been adopted to achieve these objectives, since its graphical programming interface allows building flexible and reconfigurable models, and real-time simulation is possible with standard, off-the-shelf software and hardware platforms (such as dSPACE systems).
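The HIL exchange described above can be pictured with a toy plant model standing in for the real-time simulation; every function, model and number below is an illustrative assumption, and a real rig would go through the dSPACE I/O layer.

    # Conceptual HIL loop: at each fixed step, read the actuator commands the
    # ECU would issue, advance the plant model, and return the signals the
    # real sensors would produce.
    DT = 0.001  # real-time step [s], assumed

    def read_ecu_outputs(step):
        return 0.3 if step < 2000 else 0.6    # stub throttle command [0-1]

    def write_sensor_signals(rpm):
        pass                                  # stub: would drive the ECU's sensor inputs

    def engine_model(throttle, rpm):
        # Zero-dimensional toy model: torque map plus first-order speed dynamics.
        torque = 200.0 * throttle                     # [Nm], illustrative map
        load = 0.02 * rpm                             # [Nm], speed-dependent load
        rpm += (torque - load) / 0.15 * 9.5493 * DT   # J = 0.15 kg m^2, rad/s -> rpm
        return rpm

    rpm = 800.0
    for step in range(4000):                  # 4 s of simulated time
        throttle = read_ecu_outputs(step)
        rpm = engine_model(throttle, rpm)
        write_sensor_signals(rpm)
    print(round(rpm), "rpm")                  # settling toward the 0.6-throttle equilibrium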
Abstract:
This doctoral thesis focuses on ground-based measurements of stratospheric nitric acid (HNO3) concentrations obtained by means of the Ground-Based Millimeter-wave Spectrometer (GBMS). Pressure-broadened HNO3 emission spectra are analyzed using a new inversion algorithm developed as part of this thesis work, and the retrieved vertical profiles are extensively compared to satellite-based data. The comparison effort I carried out plays a key role in establishing a long-term (1991-2010), global data record of stratospheric HNO3, with an expected impact on studies concerning ozone decline and recovery. The first part of this work is focused on the development of an ad hoc version of the Optimal Estimation Method (Rodgers, 2000) in order to retrieve HNO3 spectra observed by means of GBMS. I also performed a comparison between HNO3 vertical profiles retrieved with the OEM and those obtained with the old iterative Matrix Inversion method. Results show no significant differences in retrieved profiles and error estimates, with the OEM however providing additional information needed to better characterize the retrievals. A final section of this first part of the work is dedicated to a brief review of the application of the OEM to other trace gases observed by GBMS, namely O3 and N2O. The second part of this study deals with the validation of HNO3 profiles obtained with the new inversion method. The first step has been the validation of GBMS measurements of tropospheric opacity, which is a necessary input for the calibration of any GBMS spectra. This was achieved by means of comparisons among correlative measurements of water vapor column content (or Precipitable Water Vapor, PWV), since, in the spectral region observed by GBMS, the tropospheric opacity is almost entirely due to water vapor absorption. In particular, I compared GBMS PWV measurements collected during the primary field campaign of the ECOWAR project (Bhawar et al., 2008) with simultaneous PWV observations obtained with Vaisala RS92k radiosondes, a Raman lidar, and an IR Fourier transform spectrometer. I found that GBMS PWV measurements are in good agreement with the other three data sets, exhibiting a mean difference between observations of ~9%. After this initial validation, GBMS HNO3 retrievals have been compared to two sets of satellite data produced by the two NASA/JPL Microwave Limb Sounder (MLS) experiments (aboard the Upper Atmosphere Research Satellite (UARS) from 1991 to 1999, and on the Earth Observing System (EOS) Aura mission from 2004 to date). This part of my thesis falls within GOZCARDS (Global Ozone Chemistry and Related Trace gas Data Records for the Stratosphere), a multi-year project aimed at developing a long-term data record of stratospheric constituents relevant to the issues of ozone decline and expected recovery. This data record will be based mainly on satellite-derived measurements, but ground-based observations will be pivotal for assessing offsets between satellite data sets. Since the GBMS has been operated for more than 15 years, its nitric acid data record offers a unique opportunity for cross-calibrating HNO3 measurements from the two MLS experiments. I compare GBMS HNO3 measurements obtained from the Italian Alpine station of Testa Grigia (45.9° N, 7.7° E, elev. 3500 m) during the period February 2004 - March 2007, and from Thule Air Base, Greenland (76.5°N, 68.8°W) during polar winter 2008/09, with Aura MLS observations.
A similar intercomparison is made between UARS MLS HNO3 measurements and those carried out from the GBMS at South Pole, Antarctica (90°S), during most of 1993 and 1995. I assess systematic differences between GBMS and both UARS and Aura HNO3 data sets at seven potential temperature levels. Results show that, except for measurements carried out at Thule, ground-based and satellite data sets are consistent within the errors at all potential temperature levels.
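The core linear step of the Optimal Estimation Method (Rodgers, 2000) on which the new inversion builds can be sketched as follows; the Jacobian and covariances below are tiny placeholders, not the actual GBMS forward model.

    # One linear OEM step: x_hat = xa + G (y - K xa), with
    # G = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (Rodgers, 2000).
    import numpy as np

    def oem_step(y, K, xa, Sa, Se):
        Se_inv = np.linalg.inv(Se)
        S_hat = np.linalg.inv(K.T @ Se_inv @ K + np.linalg.inv(Sa))  # posterior covariance
        G = S_hat @ K.T @ Se_inv                                     # gain matrix
        x_hat = xa + G @ (y - K @ xa)                                # retrieved profile
        A = G @ K                                                    # averaging kernel
        return x_hat, S_hat, A

    # Tiny illustrative problem: 3 spectral channels, 2 profile levels.
    K = np.array([[1.0, 0.2], [0.5, 0.8], [0.1, 1.0]])
    xa = np.zeros(2); Sa = np.eye(2); Se = 0.01 * np.eye(3)
    x_hat, S_hat, A = oem_step(K @ np.array([1.0, 2.0]), K, xa, Sa, Se)
    print(x_hat)  # close to the true state [1, 2] in this noise-free example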
Abstract:
The objective of this thesis was to improve the commercial CFD software Ansys Fluent to obtain a tool able to perform accurate simulations of flow boiling in the slug flow regime. The achievement of a reliable numerical framework allows a better understanding of the bubble and flow dynamics induced by the evaporation and makes possible the prediction of the wall heat transfer trends. In order to save computational time, the flow is modeled with an axisymmetric formulation. Vapor and liquid phases are treated as incompressible and in laminar flow. By means of a single-fluid approach, the flow equations are written as for a single-phase flow, but discontinuities at the interface and interfacial effects need to be accounted for and discretized properly. Ansys Fluent provides a Volume Of Fluid technique to advect the interface and to map the discontinuous fluid properties throughout the flow domain. The interfacial effects are dominant in boiling slug flow, and the accuracy of their estimation is fundamental for the reliability of the solver. Self-implemented functions, developed ad hoc, are introduced within the numerical code to compute the surface tension force and the rates of mass and energy exchange at the interface related to the evaporation. Several validation benchmarks confirm the better performance of the improved software. Various adiabatic configurations are simulated in order to test the capability of the numerical framework in modeling actual flows, and the comparison with experimental results is very positive. The simulation of a single evaporating bubble underlines the dominant effect on the global heat transfer rate of the local transient heat convection in the liquid after the bubble transit. The simulation of multiple evaporating bubbles flowing in sequence shows that their mutual influence can strongly enhance the heat transfer coefficient, up to twice the single-phase flow value.
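The cell-wise mass-exchange computation performed by such functions can be illustrated with a Lee-type evaporation model, a common choice in VOF boiling solvers; the thesis implements its own ad hoc formulation inside Fluent, so the model and values below are assumptions for illustration.

    # Lee-type evaporation source: liquid mass is converted to vapor in cells
    # above saturation, at a rate set by an empirical relaxation frequency.
    T_SAT = 373.15    # saturation temperature [K], assumed
    R_EVAP = 100.0    # relaxation frequency [1/s], assumed

    def evaporation_rate(alpha_liq, rho_liq, t_cell):
        """Liquid-to-vapor mass transfer per unit volume [kg/(m^3 s)]."""
        if t_cell > T_SAT:
            return R_EVAP * alpha_liq * rho_liq * (t_cell - T_SAT) / T_SAT
        return 0.0

    # The matching sink in the energy equation is -rate * h_lv (latent heat);
    # e.g., for a nearly liquid-filled, slightly superheated cell:
    print(evaporation_rate(alpha_liq=0.9, rho_liq=958.0, t_cell=374.0))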
Abstract:
This thesis is concerned with the role played by software tools in the analysis and dissemination of linguistic corpora and their contribution to a more widespread adoption of corpora in different fields. Chapter 1 contains an overview of some of the most relevant corpus analysis tools available today, presenting their most interesting features and some of their drawbacks. Chapter 2 begins with an explanation of the reasons why none of the available tools appear to satisfy the requirements of the user community and then continues with a technical overview of the current status of the new system developed as part of this work. This presentation is followed by highlights of features that make the system appealing to users and corpus builders (i.e. scholars willing to make their corpora available to the public). The chapter concludes with an indication of future directions for the project and information on the current availability of the software. Chapter 3 describes the design of an experiment devised to evaluate the usability of the new system in comparison to another corpus tool. Usage of the tool was tested in the context of a documentation task performed on a real assignment during a translation class in a master's degree course. In chapter 4 the findings of the experiment are presented on two levels of analysis: first, a discussion of how participants interacted with and evaluated the two corpus tools in terms of interface and interaction design, usability and perceived ease of use; then an analysis of how users interacted with the corpora to complete the task and what kind of queries they submitted. Finally, some general conclusions are drawn and areas for future work are outlined.