945 results for Cu-based fcc solid solution


Relevance: 30.00%

Abstract:

Master's internship report in Informatics Teaching (Ensino de Informática)

Relevance: 30.00%

Abstract:

The increase in heavy metal contamination of freshwater systems causes serious environmental problems in most industrialized countries, and the effort to find eco-friendly techniques for reducing water and sediment contamination is fundamental for environmental protection. Permeable barriers made of natural clays can be used as low-cost, eco-friendly materials for adsorbing heavy metals from aqueous solution and thus reducing sediment contamination. This study discusses the application of permeable barriers made of vermiculite clay for heavy metal remediation at the interface between water and sediments and investigates the possibility of increasing their efficiency by loading the vermiculite surface with a microbial biofilm of Pseudomonas putida, which is well known as a heavy metal accumulator. Batch assays were performed to verify the uptake capacity of the two systems and their adsorption kinetics; the results indicated that the vermiculite bio-barrier system had a higher removal capacity than the vermiculite barrier (≈34.4 and 22.8% for Cu and Zn, respectively). Moreover, the presence of the P. putida biofilm strongly accelerated the kinetics of metal adsorption onto the vermiculite sheets. In open-system conditions, the presence of a vermiculite barrier at the interface between water and sediment could reduce sediment contamination by up to 20 and 23% for Cu and Zn, respectively, highlighting the efficiency of these eco-friendly materials for environmental applications. Nevertheless, the contribution of the microbial biofilm in an open-system setup still needs to be optimized, and some important considerations about biofilm attachment in a continuous-flow system are discussed.
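
The removal capacities and kinetics mentioned above are the kind of numbers usually obtained by fitting a kinetic model to batch time-course data. The following is a purely illustrative sketch with hypothetical data and a generic pseudo-second-order model, not values or code from this study:

    # Illustrative only: hypothetical batch-assay data, not results from the study.
    import numpy as np
    from scipy.optimize import curve_fit

    def pseudo_second_order(t, qe, k):
        """Adsorbed amount q(t) [mg/g] under the pseudo-second-order kinetic model."""
        return (k * qe**2 * t) / (1.0 + k * qe * t)

    # Hypothetical sampling times (min) and Cu uptake on the bio-barrier (mg/g)
    t = np.array([5.0, 15.0, 30.0, 60.0, 120.0, 240.0])
    q = np.array([4.1, 7.8, 10.2, 12.0, 13.1, 13.6])

    (qe_fit, k_fit), _ = curve_fit(pseudo_second_order, t, q, p0=[q.max(), 0.01])
    print(f"equilibrium uptake qe = {qe_fit:.1f} mg/g, rate constant k = {k_fit:.4f} g/(mg min)")

    # Removal efficiency relative to the initial solution concentration (hypothetical values)
    c0, c_eq = 50.0, 33.0   # mg/L before and after equilibration
    print(f"removal efficiency = {100.0 * (c0 - c_eq) / c0:.1f} %")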

Relevance: 30.00%

Abstract:

Identification and characterization of the problem. One of the most important problems associated with building software is its correctness. In the search for guarantees that software works correctly, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Because of their nature, applying formal methods requires considerable experience and knowledge, above all in mathematics and logic, which makes their application costly in practice. As a result, their main application has been limited to critical systems, that is, systems whose malfunction can cause serious damage, even though the benefits these techniques provide are relevant to every kind of software. Carrying the benefits of formal methods into software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is of great importance. Examples include several powerful analysis tools based on formal methods that target source code directly. In the vast majority of these tools, the gap between the notions developers are used to and those required to apply these formal analysis tools remains too wide. Many tools use assertion languages that fall outside developers' usual knowledge and habits. Moreover, in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts to be analysed grow (scalability). This limitation is widely known and is considered critical to the practical applicability of formal analysis methods. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to quality, in terms of functional correctness, of specifications, models or code in the context of software development. More precisely, it seeks to identify specific settings in which certain automated analysis techniques, such as SMT- or SAT-based analysis or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools that can be used by developers who are familiar with the application context but not necessarily knowledgeable about the underlying methods or techniques. Materials and methods. The materials will be literature relevant to the area and computing equipment; the methods will be those of discrete mathematics, logic and software engineering. Expected results. One expected outcome of the project is the identification of specific application domains for formal analysis methods.
The project is also expected to produce analysis tools whose usability is adequate for developers without specific training in the formal methods employed. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor in software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem essentially by relying on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. As a result, their acceptance by software engineers is rather restricted, and applications of formal methods have been confined to critical systems. Nevertheless, it is obvious that the advantages formal methods provide apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis and, in many cases, are far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus the techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes in which to apply automated formal analysis techniques. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses compared to the general case.
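
To give a concrete flavour of the SMT-based analysis mentioned in the objectives, the following minimal sketch uses the Z3 Python bindings to check a toy assertion about integer variables; the property and names are invented for illustration and are not artifacts of the project:

    # Minimal SMT-solving illustration with Z3 (pip install z3-solver).
    from z3 import Int, Solver, Implies, And, Not, sat

    x, y = Int('x'), Int('y')

    # Claimed postcondition of a hypothetical code fragment:
    #   if x >= 0 and y >= 0 then x + y >= x
    post = Implies(And(x >= 0, y >= 0), x + y >= x)

    # Ask the solver for a counterexample; unsat means the property always holds.
    s = Solver()
    s.add(Not(post))
    if s.check() == sat:
        print("counterexample:", s.model())
    else:
        print("property holds for all integers")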

Relevance: 30.00%

Abstract:

Background: The distally based anterolateral thigh (ALT) flap is an interesting reconstructive solution for complex soft tissue defects of the knee. Despite a low donor-site morbidity, a wide covering surface and a generous arc of rotation, it has never gained popularity among reconstructive surgeons; venous congestion and difficult flap dissection in the presence of a variable vascular pedicle anatomy are the likely reasons. Methods: An anatomical study of 15 cadaver legs was performed to further clarify the blood supply of the distally based ALT. We present our early experience with preoperative angiography and a safe flap design modification that avoids distal intramuscular skeletonization of the vascular pedicle and includes a subcutaneous strip running from the distal end of the flap to the pivot point. Results: The distally based ALT receives a constant and reliable retrograde vascular contribution from the superior genicular artery. Preoperative angiography reliably identified and allowed us to avoid critical Shieh Type II pedicled flaps. Preservation of a subcutaneous strip from the distal flap end to the upper knee was associated with the absence of venous congestion in a short case series. Conclusions: Preoperative angiography and a flap design modification are proposed to allow safe transfer of the distally based ALT for reconstruction of soft tissue defects of the knee.

Relevance: 30.00%

Abstract:

Gene duplication is a fundamental source of raw material for the origin of genetic novelty. It was long assumed that DNA-based gene duplication was the only source of new genes. Recently, however, RNA-based gene duplication (retroposition) was shown in multiple organisms to contribute significantly to their genetic diversity. This mechanism produces intronless gene copies (retrocopies) that are inserted at random genomic positions, independent of the position of the parental source genes. In human, mouse and fruit fly, it was demonstrated that X-linked genes spawned an excess of functional retroposed gene copies (retrogenes). In human and mouse, the X chromosome also recruited an excess of retrogenes. Here we further characterize these X-chromosome-related biases in mammals. First, we confirmed the presence of the aforementioned biases in the dog and opossum genomes. Then, based on the expression profile of retrogenes during various spermatogenic stages, we provide solid evidence that meiotic sex chromosome inactivation (MSCI) is responsible for the excess of retrogenes stemming from the X chromosome. Moreover, we show that X-linked genes started to export an excess of retrogenes just after the split of the eutherian and marsupial mammalian lineages, suggesting that MSCI originated around that time as well. More fundamentally, as MSCI reflects the spread of the recombination barrier between the X and Y chromosomes during their evolution, our observation allowed us to re-estimate the age of the mammalian sex chromosomes. Previous estimates suggested that they emerged in the common ancestor of all mammals (before the split of the monotreme lineage), whereas we show that they originated around the split of the marsupial and eutherian lineages, after the divergence of monotremes. Thus, the therian (marsupial and eutherian) sex chromosomes are younger than previously thought. We then characterized the bias related to the recruitment of genes to the X chromosome; sexually antagonistic forces are the most likely driver of this pattern. With our limited retrogene expression data it is difficult to determine the exact nature of these forces, but some conclusions can be drawn. Lastly, we examined the history of this biased recruitment: it commenced around the split of the marsupial and eutherian lineages, akin to the biased export of genes out of the X. In fact, sexually antagonistic forces are predicted to have appeared around that time as well. The history of gene recruitment to the X thereby provides indirect evidence that these forces are responsible for this bias.

Relevance: 30.00%

Abstract:

Imaging mass spectrometry (IMS) is an emerging and innovative approach for measuring the composition, abundance and regioselectivity of molecules within an investigated area of fixed dimensions. Although it provides unprecedented molecular information compared with conventional MS techniques, enhancing the protein signature obtained by IMS is still necessary and challenging. This paper demonstrates the combination of conventional organic washes with an optimized aqueous buffer for tissue section preparation before matrix-assisted laser desorption/ionization (MALDI) IMS of proteins. This wash, based on 500 mM ammonium formate in water-acetonitrile (9:1, v/v, 0.1% trifluoroacetic acid, 0.1% Triton), was shown to significantly enhance the protein signature in profiling and IMS (~fourfold) when used after organic washes (70% EtOH followed by 90% EtOH), improving the quality and number of ion images obtained from mouse kidney and whole-body 14-day mouse fetus tissue sections while maintaining a reproducibility similar to that of conventional tissue rinsing. Although some protein losses were observed, data mining demonstrated that these were primarily low-abundance signals and that the number of new peaks found was greater with the described procedure. The proposed buffer has thus proved highly efficient for tissue section preparation, providing novel and complementary information for direct on-tissue MALDI analysis compared with conventional organic rinsing alone.
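
For orientation only, the composition quoted above translates into roughly the following quantities for 100 mL of wash solution (a back-of-the-envelope estimate, not a protocol from the paper):

    # Back-of-the-envelope estimate for 100 mL of the described wash; illustrative only.
    MW_AMMONIUM_FORMATE = 63.06          # g/mol
    total_mL, conc_M = 100.0, 0.5        # 500 mM ammonium formate

    salt_g = conc_M * (total_mL / 1000.0) * MW_AMMONIUM_FORMATE
    water_mL, acn_mL = 0.9 * total_mL, 0.1 * total_mL    # water-acetonitrile 9:1 (v/v)
    additive_mL = 0.001 * total_mL                       # 0.1% (v/v) TFA and Triton each

    print(f"{salt_g:.2f} g ammonium formate, {water_mL:.0f} mL water, "
          f"{acn_mL:.0f} mL acetonitrile, {additive_mL:.1f} mL TFA, {additive_mL:.1f} mL Triton")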

Relevance: 30.00%

Abstract:

The interaction of glucose/mannose-binding lectins in solution with immobilized glycoproteins was followed in real time using surface plasmon resonance technology. The lectins, which share many biochemical and structural features, could be clearly differentiated in terms of their specificity for complex glycoconjugates. The more prominent interaction of the lectins with PHA-E compared with soybean agglutinin, both glycoproteins carrying high-mannose oligosaccharides, suggests that the overall structure of the glycoproteins themselves may influence affinity. These findings also support the hypothesis that minor amino acid replacements in the primary sequence of the lectins might be responsible for their divergence in fine specificity and biological activities. This is the first report using surface plasmon resonance technology to show differences among Diocleinae lectins with respect to their fine glycan specificity.

Relevance: 30.00%

Abstract:

An immunoassay for the detection of human anti-Toxocara canis IgG was developed based on ELISA and the use of polysiloxane/polyvinyl alcohol (POS/PVA) beads. A recombinant antigen was covalently immobilized, via glutaraldehyde, onto this hybrid inorganic-organic composite, which was prepared by the sol-gel technique. Using only 31.2 ng of antigen per bead, a peroxidase conjugate dilution of 1:10,000 and a serum dilution of 1:200 were adequate for establishing the procedure. The procedure is comparable to the conventional one based on adsorption of the antigen onto PVC plates; however, the difference between the mean absorbances of positive and negative sera was larger for the new glass-based assay. Besides the performance of the POS/PVA bead as an immunodetection matrix, its easy synthesis and low cost are further advantages for commercial application.

Relevance: 30.00%

Abstract:

A new and original reagent based on highly fluorescent cadmium telluride (CdTe) quantum dots (QDs) in aqueous solution is proposed to detect weak fingermarks in blood on non-porous surfaces. To assess the efficiency of this approach, comparisons were performed with one of the most efficient blood reagents for non-porous surfaces, Acid Yellow 7 (AY7). Four non-porous surfaces were studied: glass, transparent polypropylene, black polyethylene and aluminium foil. To evaluate the sensitivity of both reagents, sets of depleted fingermarks were prepared using the same finger, initially soaked with blood and then applied successively to the same surface without recharging it with blood or latent secretions. The successive marks were then cut in half and the halves treated separately with each reagent. The results showed that QDs were as efficient as AY7 on glass, polyethylene and polypropylene, and superior to AY7 on aluminium. The use of QDs in new, sensitive and highly efficient techniques for detecting latent marks and marks in blood therefore appears very promising. Health and safety issues related to the use of cadmium are also discussed; it is suggested that applying QDs in aqueous solution (rather than as a dry dusting powder) considerably lowers the toxicity risks.

Relevance: 30.00%

Abstract:

The only long-term and cost-effective solution to the human immunodeficiency virus (HIV) epidemic in the developing world is a vaccine that prevents individuals from becoming infected or, once infected, from passing the virus on to others. There is currently little hope for an AIDS vaccine. Conventional attempts to induce protective antibody and CD8+ lymphocyte responses against HIV and simian immunodeficiency virus (SIV) have failed. The enormous diversity of the virus has only recently been appreciated by vaccinologists, and our assays to determine CD8+ lymphocyte antiviral efficacy are inadequate. The central hypothesis of a CTL-based vaccine is that particularly effective CD8+ lymphocytes directed against at least five epitopes that are derived from regions under functional and structural constraints will control replication of pathogenic SIV. This would be somewhat analogous to control of virus replication by triple drug therapy or neutralizing antibodies.

Relevance: 30.00%

Abstract:

A solution of ¹⁸F was standardised with a 4πβ-4πγ coincidence counting system in which the beta detector is a one-inch diameter cylindrical UPS89 plastic scintillator, positioned at the bottom of a well-type 5″×5″ NaI(Tl) gamma-ray detector. Almost full detection efficiency, which was varied downwards electronically, was achieved in the beta channel. Aliquots of this ¹⁸F solution were also measured using 4πγ NaI(Tl) integral counting and Monte Carlo calculated efficiencies, as well as the CIEMAT-NIST method. Secondary measurements of the same solution were also performed with an IG11 ionisation chamber whose equivalent activity is traceable to the Système International de Référence through the contribution IRA-METAS made to it in 2001; IRA's degree of equivalence was found to be close to the key comparison reference value (KCRV). The ¹⁸F activity predicted by this coincidence system agrees closely with the ionisation chamber measurement and is compatible within one standard deviation with the other primary measurements. This work demonstrates that our new coincidence system can standardise short-lived radionuclides used in nuclear medicine.
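
For context, the idealised principle behind 4πβ-4πγ coincidence counting (neglecting background, dead-time, resolving-time and decay-scheme corrections, which a real standardisation such as the one reported here must account for) is the classical relation N0 = Nβ·Nγ/Nc, sketched below with hypothetical counting rates:

    # Idealised coincidence relation: N_beta = N0*eff_b, N_gamma = N0*eff_g,
    # N_coinc = N0*eff_b*eff_g, hence N0 = N_beta*N_gamma/N_coinc.
    # Real measurements add background, dead-time and decay-scheme corrections.
    def coincidence_activity(n_beta: float, n_gamma: float, n_coinc: float) -> float:
        """Source disintegration rate (s^-1) from the three measured counting rates."""
        return n_beta * n_gamma / n_coinc

    # Hypothetical rates (s^-1), consistent with eff_b = 0.95 and eff_g = 0.42
    print(coincidence_activity(n_beta=9500.0, n_gamma=4200.0, n_coinc=3990.0))  # -> 10000.0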

Relevance: 30.00%

Abstract:

Positioning a robot with respect to objects using data provided by a camera is a well-known technique called visual servoing. In order to perform a task, the object must exhibit visual features that can be extracted from different points of view. Visual servoing is therefore object-dependent, since it relies on the object's appearance: the positioning task cannot be performed with non-textured objects or objects for which extracting visual features is too complex or too costly. This paper proposes a solution to this limitation, inherent in current visual servoing techniques. Our proposal is based on the coded structured light approach as a reliable and fast way to solve the correspondence problem: a coded light pattern is projected onto the scene, providing robust visual features independently of the object's appearance.
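
For readers unfamiliar with the underlying control law, a classical image-based visual servoing step computes the camera velocity from the feature error. The sketch below is a generic illustration with made-up features and interaction matrix, independent of the coded-pattern features proposed in the paper:

    # Generic image-based visual servoing step: v = -gain * pinv(L) @ (s - s_star).
    # Features, desired features and interaction matrix are made up for illustration.
    import numpy as np

    def ibvs_velocity(s, s_star, L, gain=0.5):
        """Camera velocity screw [vx, vy, vz, wx, wy, wz] reducing the feature error."""
        return -gain * np.linalg.pinv(L) @ (s - s_star)

    rng = np.random.default_rng(0)
    s      = rng.random(8)        # 4 tracked image points -> 8 coordinates
    s_star = rng.random(8)        # desired feature positions
    L      = rng.random((8, 6))   # interaction (image Jacobian) matrix

    print(ibvs_velocity(s, s_star, L))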

Relevance: 30.00%

Abstract:

The in situ deposition of zinc oxide on gold nanoparticles in aqueous solution has been successfully applied here to fingermark detection on various non-porous surfaces. In this article, we present an improvement of multimetal deposition, an existing technique limited until now to non-luminescent results, by obtaining luminescent fingermarks with very good contrast and detail. This is seen as a major improvement in the field in terms of selectivity and sensitivity of detection, especially on black surfaces.

Relevance: 30.00%

Abstract:

BACKGROUND Only multifaceted, hospital-wide interventions have been successful in achieving sustained improvements in hand hygiene (HH) compliance. METHODOLOGY/PRINCIPAL FINDINGS Pre-post intervention study of HH performance at baseline (October 2007-December 2009) and during an intervention that included two phases. Phase 1 (2010) applied the multimodal WHO approach. Phase 2 (2011) added Continuous Quality Improvement (CQI) tools and was based on: a) an increase in alcohol hand rub (AHR) dispenser placement (from 0.57 to 1.56 dispensers/bed); b) an increase in the frequency of audits (three days every three weeks: the "3/3 strategy"); c) implementation of a standardized register form for HH corrective actions; d) Statistical Process Control (SPC) as the time-series analysis methodology, through appropriate control charts. During the intervention period we performed 819 scheduled direct observation audits, which provided data from 11,714 HH opportunities. The most remarkable findings were: a) significant improvement in HH compliance with respect to baseline (25% mean increase); b) a sustained high level (82%) of HH compliance during the intervention; c) a significant increase in AHR consumption over time; d) a significant decrease in the rate of healthcare-acquired MRSA; e) small but significant improvements in HH compliance when comparing phase 2 to phase 1 [79.5% (95% CI: 78.2-80.7) vs 84.6% (95% CI: 83.8-85.4), p<0.05]; f) successful use of control charts to identify significant negative and positive deviations (special causes) in the HH compliance process over time ("positive": 90.1%, the highest HH compliance, coinciding with "World Hygiene Day"; "negative": 73.7%, the lowest HH compliance, coinciding with a statutory lay-off proceeding). CONCLUSIONS/SIGNIFICANCE CQI tools may be a key addition to the WHO strategy for maintaining good HH performance over time. In addition, SPC has proved to be a powerful methodology for detecting special causes in HH performance (positive and negative) and for helping to establish adequate feedback to healthcare workers.
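
As an illustration of the SPC methodology referred to above, a p-chart (control chart for proportions) flags audit blocks whose compliance falls outside three-sigma limits around the overall mean; the audit counts below are invented, not data from the study:

    # Illustrative p-chart of the kind used in SPC to flag special causes in
    # hand-hygiene compliance; the audit data are invented.
    import numpy as np

    compliant = np.array([180, 175, 190, 168, 185, 140, 192])   # compliant opportunities
    observed  = np.array([220, 210, 225, 205, 218, 200, 215])   # audited opportunities

    p = compliant / observed
    p_bar = compliant.sum() / observed.sum()                    # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / observed)             # per-block standard error
    ucl, lcl = p_bar + 3 * sigma, np.clip(p_bar - 3 * sigma, 0, 1)

    for i, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
        flag = "special cause" if (pi < lo or pi > hi) else "in control"
        print(f"audit block {i}: p = {pi:.3f}  limits [{lo:.3f}, {hi:.3f}]  {flag}")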

Relevance: 30.00%

Abstract:

BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in Alzheimer's Disease (AD) diagnosis. However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. First, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. In order to address the small-sample-size problem, the dimension of the feature space was further reduced by Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and energy-based metrics were compared. RESULTS Several experiments were conducted to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be valid solutions for the presented problem. One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also (in combination with NMSE and PLS) makes the variation of this rate more stable. Another advance is generalization ability, since the experiments were performed on two image modalities (SPECT and PET).
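
To make the evaluation pipeline concrete, the sketch below reproduces the general pattern described above (feature reduction followed by an SVM, assessed with k-fold cross-validation) on synthetic data using generic scikit-learn components; it is not the authors' NMSE-PLS-LMNN implementation:

    # Generic reduction + SVM pipeline with k-fold cross-validation, in the spirit
    # of the CAD system above. Synthetic data and PCA/SVC stand in for NMSE-PLS-LMNN.
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Synthetic stand-in for voxel-derived features (e.g. NMSE values per ROI)
    X, y = make_classification(n_samples=120, n_features=500, n_informative=30,
                               random_state=0)

    clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf", C=1.0))
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
    print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")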