15 results for single strand conformation polymorphism analysis
at Instituto Politécnico do Porto, Portugal
Abstract:
Objective Deregulation of the FAS/FASL system may lead to immune escape and influence the outcome of bacillus Calmette-Guérin (BCG) immunotherapy, currently the gold-standard adjuvant treatment for high-risk non-muscle-invasive bladder tumors. Among other events, functional promoter polymorphisms of the FAS and FASL genes may alter their transcriptional activity. Therefore, we aimed to evaluate the role of FAS and FASL polymorphisms in the context of BCG therapy, envisaging the validation of these biomarkers to predict response. Patients and methods DNA extracted from the peripheral blood of 125 patients with bladder cancer treated with BCG therapy was analyzed by Polymerase Chain Reaction-Restriction Fragment Length Polymorphism for the FAS-670 A/G and FASL-844 T/C polymorphisms. FASL mRNA expression was analyzed by real-time Polymerase Chain Reaction. Results Carriers of the FASL-844 CC genotype present decreased recurrence-free survival after BCG treatment when compared with FASL-844 T allele carriers (mean 71.5 vs. 97.8 months, P = 0.030) and have an increased risk of BCG treatment failure (Hazard Ratio = 1.922; 95% Confidence Interval: 1.064–3.471; P = 0.030). Multivariate analysis shows that FASL-844 T/C genotype and therapeutic scheme are independent predictive markers of recurrence after treatment. Evaluation of FASL mRNA levels demonstrated that patients carrying the FASL-844 CC genotype had higher FASL expression in bladder tumors (P = 0.0027). Higher FASL levels were also associated with an increased risk of recurrence after BCG treatment (Hazard Ratio = 2.833; 95% Confidence Interval: 1.012–7.929; P = 0.047). FAS-670 A/G polymorphism analysis did not reveal any association with BCG therapy outcome. Conclusions Our results suggest that analysis of the FASL-844 T/C, but not the FAS-670 A/G, polymorphism may be used as a predictive marker of response to BCG immunotherapy.
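The recurrence-free survival comparison above rests on per-genotype Kaplan-Meier curves. A minimal sketch of the estimator in pure Python; the follow-up times and genotype groups below are invented for illustration, not data from the study:

```python
# Minimal Kaplan-Meier estimator for recurrence-free survival.
# The times/events below are illustrative placeholders, not study data.

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = recurrence, 0 = censored.
    Returns a list of (time, survival probability) steps."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = 0
        n_at_t = 0
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            n_at_t += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= n_at_t
    return curve

# Hypothetical genotype groups (months to recurrence, event indicator).
cc_group = kaplan_meier([12, 24, 36, 48, 60], [1, 1, 1, 0, 1])
t_group = kaplan_meier([40, 60, 80, 100, 120], [1, 0, 1, 0, 0])
```

Comparing the two curves (e.g., by a log-rank test or Cox regression, as hazard ratios in the abstract imply) is then a separate step.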
Abstract:
The immobilization of biological molecules is one of the most important steps in the construction of a biosensor. In the case of DNA, the way it exposes its bases determines whether electrochemical signals reach acceptable levels. A self-assembled monolayer (SAM), which binds to gold through a thiol group and to DNA through an aldehyde ligand, made it possible to detect DNA hybridization. Single-stranded DNA (ssDNA) from calf thymus was immobilized on an alkanethiol film pre-formed by incubation in a solution of 2-aminoethanethiol (Cys) followed by glutaraldehyde (Glu). Cyclic voltammetry (CV) was used to characterize the self-assembled monolayer on the gold electrode and also to study the immobilization of the ssDNA probe and its hybridization with the complementary sequence (target ssDNA). The ssDNA probe presents a well-defined oxidation peak at +0.158 V. When hybridization occurs, this peak disappears, confirming the efficacy of the annealing and the formation of the DNA double helix without the presence of electroactive indicators. The use of the SAM resulted in stable immobilization of the ssDNA probe, enabling label-free hybridization detection. This study represents a promising approach for molecular biosensors with sensitive and reproducible results.
Abstract:
Food lipid major components are usually analyzed by individual methodologies using diverse extractive procedures for each class. A simple and fast extractive procedure was devised for the sequential analysis of vitamin E, cholesterol, fatty acids, and total fat estimation in seafood, reducing analysis time and organic solvent consumption. Several liquid/liquid-based extractive methodologies using chlorinated and non-chlorinated organic solvents were tested. The extract obtained is used for vitamin E quantification (normal-phase HPLC with fluorescence detection), total cholesterol (normal-phase HPLC with UV detection), and fatty acid profile and total fat estimation (GC-FID), all accomplished in <40 min. The final methodology presents an adequate linearity range and sensitivity for tocopherol and cholesterol, with intra- and inter-day precisions (RSD) from 3 to 11% for all components. The developed methodology was applied to diverse seafood samples with positive outcomes, making it a very attractive technique for routine analyses in standard-equipped laboratories in the food quality control field.
Abstract:
The most common techniques for stress analysis/strength prediction of adhesive joints involve analytical or numerical methods such as the Finite Element Method (FEM). However, the Boundary Element Method (BEM) is an alternative numerical technique that has been successfully applied to a wide variety of engineering problems. This work evaluates the applicability of the boundary element code BEASY as a design tool to analyze adhesive joints. The linearity of peak shear and peel stresses with the applied displacement is studied and compared between BEASY and the analytical model of Frostig et al., considering a bonded single-lap joint under tensile loading. The BEM results are also compared with FEM in terms of stress distributions. To evaluate the mesh convergence of BEASY, the influence of mesh refinement on peak shear and peel stress distributions is assessed. Joint stress predictions are carried out numerically in BEASY and ABAQUS®, and analytically by the models of Volkersen, Goland and Reissner, and Frostig et al. The failure loads for each model are compared with experimental results. The preparation, processing, and mesh creation times are compared for all models. BEASY results presented good agreement with the conventional methods.
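For context, the Volkersen model cited above has a closed-form adhesive shear stress for the balanced (equal-adherend) single-lap joint. A sketch with illustrative material and geometry values, not those of the joints analysed in the work:

```python
# Volkersen shear-lag stress distribution for a balanced single-lap
# joint. All material/geometry values below are illustrative.
import math

def volkersen_shear(x, P, b, l, Ga, ta, E, t):
    """Adhesive shear stress at position x (x = 0 at the overlap centre).
    P: applied load, b: joint width, l: overlap length,
    Ga/ta: adhesive shear modulus/thickness, E/t: adherend modulus/thickness."""
    w = math.sqrt(2.0 * Ga / (E * t * ta))  # shear-lag parameter
    return (P * w / (2.0 * b)) * math.cosh(w * x) / math.sinh(w * l / 2.0)

# Sanity check: the stress must average to P/(b*l), peaking at the ends.
P, b, l = 5000.0, 25.0, 25.0              # N, mm, mm
Ga, ta, E, t = 1000.0, 0.2, 70000.0, 3.0  # MPa, mm, MPa, mm
n = 2000
xs = [-l / 2 + l * (i + 0.5) / n for i in range(n)]
avg = sum(volkersen_shear(x, P, b, l, Ga, ta, E, t) for x in xs) / n
```

The peak-to-average ratio this produces is exactly the stress concentration that the BEM/FEM distributions in the paper quantify more generally.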
Abstract:
The single-lap joint is the most commonly used adhesive joint configuration, although it endures significant bending due to the non-collinear load path, which negatively affects its load-bearing capabilities. The use of material or geometric changes to reduce this handicap is widely documented in the literature, acting by reduction of peel and shear peak stresses or by alteration of the failure mechanism arising from local modifications. In this work, the effect of using different-thickness adherends on the tensile strength of single-lap joints, bonded with a ductile and a brittle adhesive, was numerically and experimentally evaluated. The joints were tested under tension for different combinations of adherend thickness. The effect of the adherend thickness mismatch on the stress distributions was also investigated by Finite Elements (FE), which explained the experimental results and the strength prediction of the joints. The numerical study was performed by FE and Cohesive Zone Modelling (CZM), which allowed the entire fracture process to be characterized. For this purpose, an FE analysis was performed in ABAQUS® considering geometric non-linearities. In the end, a detailed comparative evaluation of unbalanced joints, commonly used in engineering applications, is presented to give an understanding of how modifications in the bonded structures' thickness can influence joint performance.
Abstract:
High-content analysis has revolutionized cancer drug discovery by identifying substances that alter the phenotype of a cell in ways that prevent tumor growth and metastasis. The high-resolution biofluorescence images from these assays allow precise quantitative measures, enabling host cells to be distinguished from tumor cells after treatment with small molecules. In this work, we are particularly interested in the application of deep neural networks (DNNs), a cutting-edge machine learning method, to the classification of compounds into chemical mechanisms of action (MOAs). Compound classification has previously been performed using image-based profiling methods, sometimes combined with feature reduction methods such as principal component analysis or factor analysis. In this article, we map the input features of each cell to a particular MOA class without using any treatment-level profiles or feature reduction methods. To the best of our knowledge, this is the first application of DNNs in this domain leveraging single-cell information. Furthermore, we use deep transfer learning (DTL) to alleviate the intensive and computationally demanding effort of searching the huge parameter space of a DNN. Results show that using this approach we obtain a 30% speedup and a 2% accuracy improvement.
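The transfer-learning idea, initialising from weights learned on a related task rather than searching the parameter space from scratch, can be shown on a toy problem. The tiny logistic "layer" and synthetic data below are illustrative stand-ins, not the study's DNN or image features:

```python
# Toy sketch of transfer learning: reuse weights trained on a "source"
# task as the starting point for a "target" task. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)

def train(X, y, W0, steps, lr=0.1):
    """Logistic-regression layer trained by gradient descent from init W0."""
    W = W0.copy()
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ W))
        W -= lr * X.T @ (p - y) / len(y)
    return W

def loss(X, y, W):
    p = 1.0 / (1.0 + np.exp(-X @ W))
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

# Source task: learn a separating direction with plenty of data/steps.
X_src = rng.normal(size=(200, 5))
w_true = np.array([1.0, -1.0, 0.5, 0.0, 0.0])
y_src = (X_src @ w_true > 0).astype(float)
W_pre = train(X_src, y_src, np.zeros(5), steps=200)

# Related target task: with the same small budget of fine-tuning steps,
# starting from W_pre beats starting from scratch.
X_tgt = rng.normal(size=(50, 5))
y_tgt = (X_tgt @ w_true > 0).astype(float)
l_transfer = loss(X_tgt, y_tgt, train(X_tgt, y_tgt, W_pre, steps=5))
l_scratch = loss(X_tgt, y_tgt, train(X_tgt, y_tgt, np.zeros(5), steps=5))
```

The speedup the abstract reports comes from exactly this effect: fewer optimisation steps are needed when the search starts near a good solution.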
Abstract:
With the need to find an alternative to mechanical and welded joints, and at the same time to overcome some limitations of these traditional techniques, adhesive bonds can be used. Adhesive bonding is a permanent joining process that uses an adhesive to bond the components of a structure. Composite materials reinforced with fibres are becoming increasingly popular in many applications as a result of a number of competitive advantages. In the manufacture of composite structures, although advanced manufacturing techniques reduce the number of joints to a minimum, connections are still required due to typical size limitations and to design, technological, and logistical aspects. Moreover, it is known that many high-performance structures require joints between composite materials and light metals such as aluminium, for purposes of structural optimization. This work deals with the experimental and numerical study of single-lap joints (SLJ), bonded with a brittle (Nagase Chemtex Denatite XNRH6823) and a ductile adhesive (Nagase Chemtex Denatite XNR6852). These are applied to hybrid joints between aluminium (AL6082-T651) and carbon fibre reinforced plastic (CFRP; Texipreg HS 160 RM) adherends with different overlap lengths (LO) under tensile loading. The Finite Element (FE) Method is used to perform detailed stress and damage analyses to explain the joints' behaviour, and the use of cohesive zone models (CZM) enables prediction of the joint strength, creating a simple and rapid design methodology. The use of numerical methods to simulate the behaviour of the joints can save time and resources by optimizing the geometry and material parameters of the joints. The joints' strength and failure modes were highly dependent on the adhesive, and this behaviour was successfully modelled numerically. Using the brittle adhesive resulted in a negligible maximum load (Pm) improvement with LO, whereas the joints bonded with the ductile adhesive showed a nearly linear improvement of Pm with LO.
Abstract:
We have developed a new method of single-drop microextraction (SDME) for the preconcentration of organochlorine pesticides (OCPs) from complex matrices. It is based on the use of a silicone ring at the tip of the syringe. A 5 μL drop of n-hexane applied to an aqueous extract containing the OCPs was found to be adequate to preconcentrate them prior to analysis by GC in combination with tandem mass spectrometry. Fourteen OCPs were determined using this technique in combination with programmable temperature vaporization, which is shown to have many advantages over traditional split/splitless injection. The effects of organic solvent type, exposure time, agitation, and organic drop volume were optimized. Relative recoveries ranged from 59 to 117%, and repeatabilities of <15% (coefficient of variation) were achieved. The limits of detection range from 0.002 to 0.150 μg kg−1. The method was applied to the preconcentration of OCPs in fresh strawberry, strawberry jam, and soil.
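The figures of merit quoted above, relative recovery and repeatability as a coefficient of variation, are simple to compute; a sketch with invented replicate values, not measurements from the study:

```python
# Relative recovery and repeatability (%CV) of a spiked determination.
# The replicate values are illustrative, not study measurements.
from statistics import mean, stdev

def relative_recovery(found, spiked):
    """Recovery (%) of a spiked analyte amount."""
    return 100.0 * found / spiked

def coefficient_of_variation(replicates):
    """Repeatability as %CV of replicate determinations."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical replicate determinations of one OCP (µg/kg) in a sample
# spiked at 1.0 µg/kg.
replicates = [0.92, 0.88, 0.95, 0.90, 0.89]
recovery = relative_recovery(mean(replicates), 1.0)
cv = coefficient_of_variation(replicates)
```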
Abstract:
The electroactivity of butylate (BTL) is studied by cyclic voltammetry (CV) and square wave voltammetry (SWV) at a glassy carbon electrode (GCE) and a hanging mercury drop electrode (HMDE). Britton-Robinson buffer solutions of pH 1.9–11.5 are used as supporting electrolyte. CV voltammograms using the GCE show a single anodic peak, corresponding to the oxidation of BTL, at +1.7 V versus AgCl/Ag, an irreversible process controlled by diffusion. Using the HMDE, a single cathodic peak is observed at 1.0 V versus AgCl/Ag. The reduction of BTL is irreversible and controlled by adsorption. Mechanisms are proposed for these redox transformations. Optimisation was carried out univariately. Linearity ranges were 0.10–0.50 mmol L−1 and 2.0–9.0 µmol L−1 for the anodic and cathodic peaks, respectively. The proposed method is applied to the determination of BTL in waters. Analytical results compare well with those obtained by an HPLC method.
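The linearity ranges above imply a univariate calibration of peak current against concentration; a least-squares sketch with invented current values, not the BTL voltammetric data:

```python
# Linear calibration: peak current vs. concentration by ordinary least
# squares, then back-calculation of an unknown. Values are illustrative.
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Hypothetical calibration: concentration (mmol L-1) vs. peak current (µA).
conc = [0.10, 0.20, 0.30, 0.40, 0.50]
current = [1.1, 2.0, 3.1, 3.9, 5.0]
slope, intercept = linear_fit(conc, current)
# An unknown's concentration from its measured peak current:
c_unknown = (2.5 - intercept) / slope
```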
Abstract:
Catastrophic events, such as wars and terrorist attacks, tornadoes and hurricanes, earthquakes, tsunamis, floods and landslides, are always accompanied by a large number of casualties. The size distributions of these casualties have separately been shown to follow approximate power law (PL) distributions. In this paper, we analyze the statistical distributions of the number of victims of catastrophic phenomena, in particular terrorism, and find double PL behavior. This means that the data sets are better approximated by two PLs instead of a single one. We plot the PL parameters corresponding to several events and observe an interesting pattern in the charts, where the lines that connect each pair of points defining the double PLs are almost parallel to each other. A complementary data analysis is performed by means of the computation of the entropy. The results reveal relationships hidden in the data that may trigger a future comprehensive explanation of these phenomena.
Abstract:
In this paper we survey the most relevant results for the priority-based schedulability analysis of real-time tasks, for both fixed and dynamic priority assignment schemes. We give emphasis to worst-case response time analysis in non-preemptive contexts, which is fundamental for communication schedulability analysis. We define an architecture to support priority-based scheduling of messages at the application process level of a specific fieldbus communication network, the PROFIBUS. The proposed architecture improves the worst-case messages' response time, overcoming the limitation of first-come-first-served (FCFS) PROFIBUS queue implementations.
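The worst-case response time analysis surveyed here is usually computed as a fixed-point recurrence, R = C + B + sum over higher-priority tasks of ceil(R / Tj) * Cj, where B is a blocking term (e.g., one non-preemptable lower-priority message already in transit). A sketch on an invented task set:

```python
# Classical fixed-priority worst-case response-time recurrence.
# The task set below is illustrative, not from the surveyed systems.
import math

def response_time(C, B, higher):
    """C: execution time, B: blocking, higher: [(Cj, Tj), ...] for
    higher-priority tasks. Returns the worst-case response time, or
    None if the iteration diverges (task set unschedulable at this level)."""
    R = C + B
    while True:
        new_R = C + B + sum(math.ceil(R / Tj) * Cj for Cj, Tj in higher)
        if new_R == R:
            return R
        if new_R > 1000 * (C + B + 1):  # crude divergence guard
            return None
        R = new_R

# Three tasks, highest priority first, as (C, T) pairs: (1, 4), (2, 6), (3, 12).
# Response time of the lowest-priority task with no blocking:
R3 = response_time(3, 0, [(1, 4), (2, 6)])
```

The same recurrence shape carries over to non-preemptive message scheduling once B captures the longest lower-priority transmission.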
Abstract:
LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with smaller laxity, has been known as an optimal preemptive scheduling algorithm on a single processor platform. However, little work has been done to illuminate its characteristics on multiprocessor platforms. In this paper, we identify the dynamics of laxity from the system's viewpoint and translate these dynamics into an LLF multiprocessor schedulability analysis. More specifically, we first characterize laxity properties under LLF scheduling, focusing on the laxity dynamics associated with a deadline miss. These laxity dynamics describe a lower bound, which leads to the deadline miss, on the number of tasks of certain laxity values at certain time instants. This lower bound is significant because it represents invariants for highly dynamic system parameters (laxity values). Since the laxity of a task depends on the amount of interference from higher-priority tasks, we can then derive a set of conditions to check whether a given task system can enter the laxity dynamics towards a deadline miss. This way, to the best of the authors' knowledge, we propose the first LLF multiprocessor schedulability test based on its own laxity properties. We also develop an improved schedulability test that exploits slack values. We mathematically prove that the proposed LLF tests dominate the state-of-the-art EDZL tests. We also present simulation results to evaluate the schedulability performance of both the original and improved LLF tests in a quantitative manner.
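The laxity at time t is deadline minus t minus remaining work; LLF runs the m smallest-laxity ready jobs in each slot, and a negative laxity certifies an unavoidable deadline miss. A unit-time simulation sketch for synchronous one-shot jobs (an illustrative model, not the sporadic task systems analysed in the paper):

```python
# Unit-time LLF simulation on m identical processors for one-shot jobs.
# Laxity = deadline - now - remaining; negative laxity implies a miss.

def llf_schedule(jobs, m):
    """jobs: list of [remaining, deadline] pairs released at time 0.
    Returns True iff all jobs meet their deadlines under LLF."""
    t = 0
    jobs = [list(j) for j in jobs]
    while any(rem > 0 for rem, _ in jobs):
        ready = [j for j in jobs if j[0] > 0]
        if any(d - t - rem < 0 for rem, d in ready):
            return False  # negative laxity: deadline miss unavoidable
        # run the m jobs with smallest laxity for one time unit
        ready.sort(key=lambda j: j[1] - t - j[0])
        for j in ready[:m]:
            j[0] -= 1
        t += 1
    return True

# Tight but feasible set on two processors (total demand 8, capacity 8):
ok1 = llf_schedule([[2, 4], [2, 4], [4, 4]], 2)
# Infeasible set (demand 9 exceeds capacity 6):
ok2 = llf_schedule([[3, 3], [3, 3], [3, 3]], 2)
```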
Abstract:
Obesity and type 2 diabetes mellitus (T2D) are two major public health problems that have motivated the scientific community to investigate the high contribution of genetic factors to these disorders. Peroxisome proliferator-activated receptor gamma 2 (PPARγ2) plays an important role in lipid metabolism. Since PPARγ2 is expressed mainly in adipose tissue, a moderate reduction of its activity influences insulin sensitivity, diabetes, and other metabolic parameters. The present study aims to contribute to the elucidation of the impact of the Pro12Ala polymorphism on T2D and obesity through a meta-analysis of the literature that included approximately 11,500 individuals, of whom 3,870 were obese and 7,625 were diabetic. Statistical evidence supports a protective effect of the PPARγ2 Pro12Ala polymorphism in T2D (OR = 0.702, 95% CI: 0.622–0.791, P < 0.01). Conversely, the same PPARγ2 Pro12Ala polymorphism seems to favor obesity, with 1.196-fold increased odds (OR = 1.196, 95% CI: 1.009–1.417, P < 0.004). Our results suggest that the Pro12Ala polymorphism enhances both the adipogenic and the antidiabetogenic physiological roles of PPARγ. Does the Pro12Ala polymorphism represent an evolutionary step towards the stabilization of the molecular function of the PPARγ transcription factor signaling pathway?
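Pooled odds ratios like those quoted above are conventionally obtained by inverse-variance weighting of per-study log odds ratios; a fixed-effect sketch with invented 2x2 tables, not the studies included in this meta-analysis:

```python
# Fixed-effect (inverse-variance) pooling of odds ratios on the log
# scale, with a Wald 95% CI. The 2x2 tables are invented for illustration.
import math

def study_log_or(a, b, c, d):
    """a/b: exposed cases/controls, c/d: unexposed cases/controls.
    Returns (log OR, variance of log OR by Woolf's method)."""
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return log_or, var

def pooled_or(tables):
    """Fixed-effect pooled OR with a 95% confidence interval."""
    stats = [study_log_or(*t) for t in tables]
    weights = [1 / v for _, v in stats]
    pooled = sum(w * lo for (lo, _), w in zip(stats, weights)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# Two hypothetical case-control studies of a protective allele:
or_hat, lo, hi = pooled_or([(20, 80, 40, 60), (15, 85, 30, 70)])
```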
Abstract:
Graphics processors were originally developed for rendering graphics but have recently evolved into an architecture for general-purpose computations. They are also expected to become important parts of embedded systems hardware, and not just for graphics. However, this necessitates the development of appropriate timing analysis techniques, because techniques developed for CPU scheduling are not applicable. The reason is that we are not interested in how long it takes for any given GPU thread to complete, but rather in how long it takes for all of them to complete. We therefore develop a simple method for finding an upper bound on the makespan of a group of GPU threads executing the same program and competing for the resources of a single streaming multiprocessor (whose architecture is based on NVIDIA Fermi, with some simplifying assumptions). We then build upon this method to formulate the derivation of the exact worst-case makespan (and corresponding schedule) as an optimization problem. Addressing the issue of tractability, we also present a technique for efficiently computing a safe estimate of the worst-case makespan with minimal pessimism, which may be used when finding an exact value would take too long.
Abstract:
LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with smaller laxity, has been known as an optimal preemptive scheduling algorithm on a single processor platform. However, its characteristics on multiprocessor platforms have been little studied until now. Orthogonally, it has remained open how to efficiently schedule general task systems, including constrained-deadline task systems, upon multiprocessors. Recent studies have introduced the zero laxity (ZL) policy, which assigns a higher priority to a task with zero laxity, as a promising scheduling approach for such systems (e.g., EDZL). Towards understanding the importance of laxity in multiprocessor scheduling, this paper investigates the characteristics of the ZL policy and presents the first ZL schedulability test for any work-conserving scheduling algorithm that employs this policy. It then investigates the characteristics of LLF scheduling, which also employs the ZL policy, and derives the first LLF-specific schedulability test on multiprocessors. It is shown that the proposed LLF test dominates the ZL test as well as the state-of-the-art EDZL test.