979 results for non-thermal technologies


Relevance: 30.00%
Publisher:
Abstract:

The safe use of nuclear power plants (NPPs) requires a deep understanding of the physical processes and systems involved. Studies on thermal hydraulics have been carried out in various separate-effects and integral test facilities at Lappeenranta University of Technology (LUT), either to ensure the functioning of the safety systems of light water reactors (LWRs) or to produce validation data for the computer codes used in the safety analyses of NPPs. Several examples of safety studies on the thermal hydraulics of nuclear power plants are discussed. The studies relate to physical phenomena occurring in different processes in NPPs, such as rewetting of the fuel rods, emergency core cooling (ECC), natural circulation, small-break loss-of-coolant accidents (SBLOCA), non-condensable gas release and transport, and passive safety systems. Studies on both VVER and advanced light water reactor (ALWR) systems are included. The set of cases includes separate-effects tests for understanding and modeling a single physical phenomenon, separate-effects tests to study the behavior of an NPP component or a single system, and integral tests to study the behavior of the whole system. The following steps can be found in these studies, though not necessarily all in the same study. Experimental studies as such have provided solutions to existing design problems. Experimental data have been created to validate a single model in a computer code. Validated models are used in various transient analyses of scaled facilities or NPPs. Integral test data are used to validate the computer codes as a whole, to see how the implemented models work together in a code. In the final stage, test results from the facilities are transferred to the NPP scale using computer codes.
Some of the experiments have confirmed the expected behavior of the system or procedure under study; in others, unexpected phenomena have prompted changes to the original design to avoid the recognized problems. This is the main motivation for experimental studies on the thermal hydraulics of NPP safety systems. Naturally, the behavior of new system designs has to be checked with experiments, but so does that of existing designs applied under conditions differing from those for which they were originally designed. New procedures for existing reactors and new safety-related systems have been developed for new nuclear power plant concepts, so new experiments are continuously needed.

Abstract:

Despite the importance of water in our daily lives, some of its properties remain unexplained. The interactions of water with organic particles are investigated in research groups all over the world, but controversy still surrounds many aspects of their description. In my work I have tried to understand these interactions on a molecular level, using both analytical and numerical methods. Recent investigations describe liquid water as a random network formed by hydrogen bonds. The insertion of a hydrophobic particle at low temperature breaks some of these hydrogen bonds, which is energetically unfavorable. The water molecules, however, rearrange in a cage-like structure around the solute particle, in which even stronger hydrogen bonds are formed between water molecules, so the solute particles are soluble. At higher temperatures this strict ordering is disrupted by thermal motion and dissolution becomes unfavorable: the particles minimize their surface exposed to water by aggregating. At even higher temperatures, entropic effects become dominant and water and solute particles mix again. Using a model based on these changes in water structure, I have reproduced the essential phenomena connected to hydrophobicity, including an upper and a lower critical solution temperature, which bound the temperature and density region in which aggregation occurs; outside this region the solute particles are soluble in water. Because the simple mixture model was shown to contain implicitly the many-body interactions between solute molecules, the study contributes to the qualitative understanding of the hydrophobic effect.
I have also studied the aggregation of hydrophobic particles in aqueous solutions in the presence of cosolvents. The important features of the destabilizing effect of chaotropic cosolvents on hydrophobic aggregates may be described within the same two-state model, adapted to account for the ability of such substances to alter the structure of water; the relevant phenomena include a significant enhancement of the solubility of non-polar solute particles and preferential binding of chaotropic substances to solute molecules. In a similar fashion, I have analyzed the stabilizing effect of kosmotropic cosolvents: including their ability to enhance the structure of liquid water leads to reduced solubility, a larger aggregation regime, and preferential exclusion of the cosolvent from the hydration shell of hydrophobic solute particles. I have further adapted the MLG model to the solvation of amphiphilic solute particles by allowing different distributions of hydrophobic regions on the molecular surface, finding aggregation of the amphiphiles and the formation of various types of micelle as a function of the hydrophobicity pattern; certain features of micelle formation are thus reproduced by a model describing alterations of water structure near different surface regions of the dissolved amphiphiles. Hydrophobicity also remains a controversial quantity in protein science. Based on the surface exposure of the 20 amino acids in native proteins, I have defined a new hydrophobicity scale, which may improve the comparison of experimental data with results from theoretical HP models. Overall, I have shown that the primary features of the hydrophobic interaction in aqueous solutions may be captured within a model that focuses on alterations in water structure around non-polar solute particles, and the results obtained within this model may illuminate the processes underlying the hydrophobic interaction.
Life on our planet began in water and could not exist without it: animal and plant cells contain up to 95% water. Although water is generally a good solvent, a large group of molecules, called hydrophobic molecules (from the Greek "hydro" = "water" and "phobia" = "fear"), do not dissolve easily in it. Hydrophobic particles avoid contact with water and form aggregates that minimize their exposed surface; the force between the particles is called the hydrophobic interaction, and the physical mechanisms that produce it are not well understood at present. The aim of this work was to clarify the mechanism of the hydrophobic interaction, which is fundamental to membrane formation and to the biological processes in our body. The adapted model also describes micelle formation by amphiphilic particles such as lipids, whose surface is partly hydrophobic and partly hydrophilic ("hydro-phile" = "water-loving"), as well as hydrophobicity-driven protein folding. In future studies I will continue to investigate aqueous solutions of different particles with the techniques acquired during this thesis, trying to understand the physical properties of the liquid most important to our life: water.
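The two-state picture described above can be sketched numerically. The parameters below are purely illustrative assumptions (not the thesis values): a bulk water site and a hydration-shell water site each have an intact-bond and a broken-bond state, with the shell's intact bond stronger but its broken state less degenerate because of the cage-like ordering (units with k_B = 1):

```python
import numpy as np

def free_energy(T, energies, degeneracies):
    """f = -T ln Z for a discrete two-state water site (k_B = 1)."""
    Z = sum(g * np.exp(-E / T) for E, g in zip(energies, degeneracies))
    return -T * np.log(Z)

# Assumed illustrative parameters (not from the thesis):
# bulk site: intact bond E = -1.0 (g = 1), broken E = 0 (g = 10)
# shell site: stronger intact bond E = -1.5 (g = 1), but fewer
# broken configurations (g = 5) due to the cage-like ordering
bulk = ([-1.0, 0.0], [1, 10])
shell = ([-1.5, 0.0], [1, 5])

def transfer_free_energy(T):
    """Free-energy cost of moving a water site from bulk to shell."""
    return free_energy(T, *shell) - free_energy(T, *bulk)

print(transfer_free_energy(0.1) < 0)  # low T: cage favorable -> soluble
print(transfer_free_energy(5.0) > 0)  # high T: entropy penalty -> aggregation
```

Resolubility at very high temperature additionally involves the solute mixing entropy, which this per-site term alone does not capture.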

Abstract:

The fast development of new technologies such as digital medical imaging has led to the expansion of brain functional studies. One key methodological issue in such studies is comparing neuronal activation between individuals. In this context, the great variability of brain size and shape is a major problem. Current methods allow inter-individual comparisons by normalizing subjects' brains to a standard brain. The most widely used standard brains are the proportional grid of Talairach and Tournoux and the Montreal Neurological Institute (MNI) standard brain (SPM99). However, registration methods based on these standard brains lack the precision to superimpose the more variable portions of the cerebral cortex (e.g., the neocortex and the perisylvian zone) and brain regions that are highly asymmetric between the two hemispheres (e.g., the planum temporale). The aim of this thesis is to evaluate a new image-processing technique based on non-rigid, model-based registration. Contrary to intensity-based registration, model-based registration uses spatial rather than intensity information to fit one image to another. We extract identifiable anatomical features (point landmarks) in both the deforming and the target images, and from their correspondence we determine the appropriate deformation in 3D. As landmarks we use six control points, situated bilaterally: one on Heschl's gyrus, one on the motor hand area, and one on the sylvian fissure. The evaluation of this model-based approach is performed on MRI and fMRI images of nine of the eighteen subjects who participated in the earlier study of Maeder et al. Results on anatomical (MRI) images show the movement of the deforming brain's control points to the locations of the reference brain's control points; the distance between the deforming and reference brains is smaller after registration than before. Registration of functional (fMRI) images does not show a significant variation: the small number of landmarks (six control points) is obviously not sufficient to produce significant modifications of the fMRI statistical maps. This thesis opens the way to a new computational technique for cortex registration, whose main direction will be improving the registration algorithm by using not a single point but many points representing a particular sulcus as landmarks.
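The landmark step, determining a 3D deformation from corresponding control points, can be illustrated with a simpler stand-in: a least-squares affine fit. The thesis evaluates a non-rigid method; the sketch below, with made-up coordinates, only shows how point correspondences constrain a transform:

```python
import numpy as np

def fit_affine_3d(src, dst):
    """Least-squares affine transform mapping src landmarks onto dst.
    src, dst: (n, 3) arrays of corresponding control points, n >= 4."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])        # homogeneous coordinates
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params                                 # (4, 3): linear part + translation row

def apply_affine_3d(params, pts):
    return np.hstack([pts, np.ones((pts.shape[0], 1))]) @ params

# Six made-up control points (standing in for Heschl's gyrus, the motor
# hand area and the sylvian fissure, bilaterally) in the deforming brain...
src = np.array([[40., 20., 10.], [-40., 20., 10.], [30., -10., 50.],
                [-30., -10., 50.], [45., 5., 25.], [-45., 5., 25.]])
# ...and their positions in the reference brain (scaled and shifted)
dst = 1.1 * src + np.array([2.0, -3.0, 1.0])

params = fit_affine_3d(src, dst)
moved = apply_affine_3d(params, src)
print(np.allclose(moved, dst))  # True: dst is exactly affine in src
```

A non-rigid scheme would instead interpolate a smooth displacement field between the landmarks (e.g., with radial basis functions), so that distant tissue deforms less than the control points themselves.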

Abstract:

Western countries have spent substantial amounts of money to facilitate the integration of Information and Communication Technologies (ICT) into education, hoping to find a solution to the touchy equation summarized by the famous statement "do more and better with less". Despite these efforts, and notwithstanding the real improvements due to the undeniable betterment of infrastructure and quality of service, this goal is far from being reached. Although we think it illusory to expect technology, all by itself, to solve our economic and educational problems, we firmly take the view that it can greatly contribute not only to improving learning conditions but also to rethinking the pedagogical approach; every member of our community could hence take advantage of this opportunity to reflect upon his or her strategy. In this framework, and convinced that integrating ICT into education opens a number of very interesting avenues provided we think teaching "out of the box", we became interested in courseware development positioned at the intersection of didactics and pedagogical sciences, cognitive sciences and computing. Hoping to bring a realistic and simple solution that could help develop, update, integrate and sustain courseware, we got involved in concrete projects. As we gained field experience we noticed that (i) the quality of courseware is still disappointing, among other reasons because the added value that technology can bring is not exploited as much as it could or should be, and (ii) a project requires, besides bringing a useful answer to a real problem, to be efficiently managed and to be "championed". With the aim of proposing a pragmatic and practical project-management approach, we first looked into the characteristics of open and distance learning projects. We then analyzed existing methodologies in the hope of being able to use one of them, or a suitable combination, that would best fit our needs.
In an empirical manner, proceeding by successive iterations and refinements, we defined a simple methodology and contributed to building descriptive "cards" attached to each of its phases to help decision making. We describe the different actors involved in the process, insisting specifically on the pedagogical engineer, viewed as an orchestra conductor, whom we consider critical to the success of our approach. Last but not least, we validated our methodology a posteriori by reviewing four of the projects we participated in, which we think are emblematic of university reality. We believe that the implementation of our methodology, along with the availability of computerized decision-support cards for project managers, could constitute a great asset and contribute to measuring the technologies' real impacts on (i) the evolution of teaching practices, (ii) the organization and (iii) the quality of pedagogical approaches. Our methodology could hence help put in place a quality-assessment approach specific to open and distance learning. Research on the impact of technologies on learning adaptability and flexibilization could then rely on adequate metrics, which remain to be defined.

Abstract:

The sol-gel synthesis of bulk silica-based luminescent materials from the innocuous monomers hexaethoxydisilane and hexamethoxydisilane, followed by one hour of thermal annealing in an inert atmosphere at 950 °C to 1150 °C, is reported. As-synthesized hexamethoxydisilane-derived samples exhibit an intense blue photoluminescence band, whereas thermally treated ones emit stronger photoluminescence peaking below 600 nm. For the hexaethoxydisilane-based material annealed at or above 1000 °C, a less intense photoluminescence band peaking between 780 nm and 850 nm is observed, attributed to nanocrystalline silicon. Mixtures of both precursors lead to composite spectra, suggesting the possibility of obtaining pre-designed spectral behaviors by varying the mixture composition.

Abstract:

Cirrhosis is the final stage of most chronic liver diseases and is almost invariably complicated by portal hypertension, the most important cause of morbidity and mortality in these patients. This review focuses on the non-invasive methods currently used in clinical practice for diagnosing liver cirrhosis and portal hypertension. The first-line techniques include physical examination, laboratory parameters, transient elastography and Doppler ultrasound. More sophisticated but less commonly employed imaging methods are CT and MRI, and new technologies currently under evaluation are MR elastography and acoustic radiation force impulse (ARFI) imaging. Even if none of them can replace the invasive measurement of the hepatic venous pressure gradient or the endoscopic screening of gastroesophageal varices, they notably facilitate the clinical management of patients with cirrhosis and portal hypertension, and provide valuable prognostic information.

Abstract:

The technosciences (digital technologies, bio- and nanotechnologies, neuroscience, personalized medicine, synthetic biology) come with fabulous promises addressed to the public and to decision-makers. The resulting economy of promises affects the research-funding regime and the governance of sociotechnical change. It creates enthusiasm, fuels scientific competition, attracts financial resources and legitimizes large public expenditures. This book highlights the accelerated cycles of enthusiasm and disillusion, the mismatches between horizons of expectation, and the democratic questions they raise. Based on field research in the social studies of science and technology, philosophy and history, it examines alternative forms of research organization, citizen participation, and distribution of property rights and benefits, and shows that a certain slowing down of the promises, not of the sciences, would favor their articulation with society's needs. The ambition of the texts collected here is to open the debate in French by directly questioning the regime of technoscientific promises.

Abstract:

This paper contains a joint ESHG/ASHG position document with recommendations regarding responsible innovation in prenatal screening with non-invasive prenatal testing (NIPT). By virtue of its greater accuracy and safety with respect to prenatal screening for common autosomal aneuploidies, NIPT has the potential of helping the practice better achieve its aim of facilitating autonomous reproductive choices, provided that balanced pretest information and non-directive counseling are available as part of the screening offer. Depending on the health-care setting, different scenarios for NIPT-based screening for common autosomal aneuploidies are possible. The trade-offs involved in these scenarios should be assessed in light of the aim of screening, the balance of benefits and burdens for pregnant women and their partners and considerations of cost-effectiveness and justice. With improving screening technologies and decreasing costs of sequencing and analysis, it will become possible in the near future to significantly expand the scope of prenatal screening beyond common autosomal aneuploidies. Commercial providers have already begun expanding their tests to include sex-chromosomal abnormalities and microdeletions. However, multiple false positives may undermine the main achievement of NIPT in the context of prenatal screening: the significant reduction of the invasive testing rate. This document argues for a cautious expansion of the scope of prenatal screening to serious congenital and childhood disorders, only following sound validation studies and a comprehensive evaluation of all relevant aspects. A further core message of this document is that in countries where prenatal screening is offered as a public health programme, governments and public health authorities should adopt an active role to ensure the responsible innovation of prenatal screening on the basis of ethical principles. 
Crucial elements are the quality of the screening process as a whole (including non-laboratory aspects such as information and counseling), the education of professionals, systematic evaluation of all aspects of prenatal screening, the development of better evaluation tools in light of the aim of the practice, accountability to all stakeholders (including children born from screened pregnancies and persons living with the conditions targeted by prenatal screening), and the promotion of equity of access.
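The concern that expanding screening to rare microdeletions multiplies false positives can be made concrete with a positive-predictive-value calculation. The sensitivity, specificity and prevalence figures below are illustrative assumptions, not values from the document:

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Common autosomal aneuploidy: relatively high prevalence among screened pregnancies
print(round(ppv(0.999, 0.999, 1 / 500), 3))    # 0.667
# Rare microdeletion: identical test performance, prevalence 1 in 20,000
print(round(ppv(0.999, 0.999, 1 / 20000), 3))  # 0.048
```

With the same excellent test, roughly 19 out of 20 positive microdeletion results would be false, so most screen-positive women would still be referred for the invasive confirmation the screening was meant to avoid.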

Abstract:

Al(C9H6ON)3·2.5H2O was precipitated from a mixture of an aqueous solution of aluminium ion and an acid solution of 8-hydroxyquinoline by increasing the pH to 9.5 with aqueous ammonia. The TG curves in a nitrogen atmosphere show mass losses due to dehydration and partial volatilisation (sublimation plus vaporisation) of the anhydrous compound, followed by thermal decomposition with the formation of carbonaceous residues. The ratio of sublimation to vaporisation depends on the heating rate used. The non-isothermal integral isoconversional methods, in the form of the Ozawa-Flynn-Wall and Kissinger-Akahira-Sunose (KAS) linear equations, were used to obtain the kinetic parameters from the TG and DTA curves, respectively. Although both the dehydration and the volatilisation reactions yield linear plots with both methods, the validity condition 20 ≤ E/RT ≤ 50 was verified only for the volatilisation reaction.
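The isoconversional fits mentioned above can be sketched in a few lines: at a fixed conversion, ln β is regressed on 1/T across heating rates (Ozawa-Flynn-Wall, where Doyle's approximation gives a slope of -1.052E/R), or ln(β/T²) on 1/T (KAS, slope -E/R). The temperatures below are synthetic, generated from an assumed E = 120 kJ/mol, and are not data from the paper:

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def activation_energy_ofw(betas, temps):
    """Ozawa-Flynn-Wall: regress ln(beta) on 1/T at fixed conversion.
    With Doyle's approximation the slope equals -1.052*E/R."""
    slope = np.polyfit(1.0 / np.asarray(temps), np.log(betas), 1)[0]
    return -slope * R / 1.052

def activation_energy_kas(betas, temps):
    """Kissinger-Akahira-Sunose: regress ln(beta/T^2) on 1/T; slope = -E/R."""
    t = np.asarray(temps)
    slope = np.polyfit(1.0 / t, np.log(np.asarray(betas) / t**2), 1)[0]
    return -slope * R

# Synthetic temperatures (K) at 50% conversion for heating rates of
# 5, 10 and 20 K/min, built from an assumed E_true = 120 kJ/mol
betas = np.array([5.0, 10.0, 20.0])
E_true = 120e3
C = np.log(10.0) + 1.052 * E_true / (R * 550.0)  # anchor: T = 550 K at beta = 10
temps = 1.052 * E_true / (R * (C - np.log(betas)))

print(round(activation_energy_ofw(betas, temps) / 1e3, 1))  # 120.0 (kJ/mol)
print(round(E_true / (R * 550.0), 1))  # ~26.2, inside the 20-50 validity window
```

Running `activation_energy_kas` on the same data recovers a comparable value (the two methods differ slightly by construction), and the E/RT printout mirrors the validity check quoted in the abstract.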

Relevância:

30.00%

Publicador:

Resumo:

One filler often used in flexible polyurethane foams is calcium carbonate (CaCO3), because of its non-abrasiveness, non-toxicity and ease of pigmentation. However, the excess of commercial CaCO3 used in industry may cause permanent deformation and damage the quality of the final product. The effect of different concentrations of commercial CaCO3 in flexible foams was studied. Flexible polyurethane foams were synthesised with different concentrations of CaCO3 and submitted to morphological and thermal analyses to verify the alterations provoked by the progressive introduction of this filler.

Relevância:

30.00%

Publicador:

Resumo:

Unprocessed native starches are structurally too weak and functionally too restricted for application in today's advanced food technologies; processing is necessary to engender a range of functionality. Natural or native starches can be modified by physical, chemical, enzymatic or combined methods, according to the industrial purpose. In this work, native corn starch was hydrolyzed with hydrochloric acid solution and investigated using thermoanalytical techniques (thermogravimetry - TG, differential thermal analysis - DTA and differential scanning calorimetry - DSC), as well as optical microscopy and X-ray diffractometry. After acid treatment at 30 and 50 °C, a decrease of the gelatinization enthalpy (ΔHgel) was verified. Optical microscopy and X-ray diffractometry allowed us to verify the granule contour and rugosity typical of cereal starches.
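The gelatinization enthalpy ΔHgel reported from DSC is obtained by integrating the endothermic peak above a baseline and normalizing by heating rate and sample mass. A minimal sketch with synthetic data (the Gaussian peak, baseline, heating rate and mass are all invented for illustration, not measured values from this work):

```python
import numpy as np

beta = 10.0 / 60.0   # heating rate: 10 K/min expressed in K/s
mass_mg = 9.0        # hypothetical sample mass, mg

# Synthetic DSC trace: Gaussian endotherm (mW) on a linear instrument baseline
T = np.linspace(40.0, 100.0, 2001)            # temperature axis, °C
baseline = 0.05 + 0.001 * (T - 40.0)          # baseline heat flow, mW
peak = 2.0 * np.exp(-((T - 70.0) ** 2) / (2 * 3.0 ** 2))
heat_flow = baseline + peak

# Subtract the baseline and integrate over temperature (trapezoid rule, mW*K),
# convert to energy via the heating rate (mJ), normalize by mass (mJ/mg = J/g).
net = heat_flow - baseline
area = float(np.sum(0.5 * (net[1:] + net[:-1]) * np.diff(T)))  # mW*K
dH_gel = area / beta / mass_mg                                  # J/g
print(f"ΔHgel ≈ {dH_gel:.1f} J/g")
```

Comparing ΔHgel computed this way before and after acid treatment is how the reported enthalpy decrease would be quantified.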

Relevância:

30.00%

Publicador:

Resumo:

The present manuscript represents the completion of a research path carried out during my doctoral studies at the University of Turku. It contains information regarding my scientific contribution to the field of open quantum systems, accomplished in collaboration with other scientists. The main subject investigated in the thesis is the non-Markovian dynamics of open quantum systems, with a focus on continuous variable quantum channels, e.g. quantum Brownian motion models. Non-Markovianity is here interpreted as a manifestation of the existence of a flow of information exchanged by the system and environment during the dynamical evolution. While in Markovian systems the flow is unidirectional, i.e. from the system to the environment, in non-Markovian systems there are time windows in which the flow is reversed and the quantum state of the system may regain coherence and correlations previously lost. Signatures of non-Markovian behavior have been studied in connection with the dynamics of quantum correlations like entanglement or quantum discord. Moreover, in the attempt to recognise non-Markovianity as a resource for quantum technologies, it is proposed, for the first time, to consider its effects in practical quantum key distribution protocols. It has been proven that the security of coherent state protocols can be enhanced using non-Markovian properties of the transmission channels. The thesis is divided into two parts: in the first part I introduce the reader to the world of continuous variable open quantum systems and non-Markovian dynamics. The second part consists of a collection of five publications on the topic.
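The information-flow picture of non-Markovianity described above can be illustrated with a pure-dephasing qubit: the trace distance between two initially orthogonal superposition states equals the modulus of the decoherence function, and any time window where it grows signals information flowing back from the environment. A toy sketch (the damped-oscillating decoherence function is invented for illustration and does not model any specific environment from the thesis):

```python
import numpy as np

t = np.linspace(0.0, 10.0, 1001)

# Hypothetical decoherence function of a structured environment:
# a damped oscillation, so coherence is partially regained periodically.
c = np.exp(-0.2 * t) * np.cos(t)

# For pure dephasing, the trace distance between the evolved states
# (|0>+|1>)/sqrt(2) and (|0>-|1>)/sqrt(2) is exactly |c(t)|.
D = np.abs(c)

# Markovian dynamics: D decreases monotonically (one-way information flow).
# Non-Markovian dynamics: there are intervals with dD/dt > 0 (backflow).
backflow = np.diff(D) > 0
print("backflow intervals present:", bool(backflow.any()))
```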

Relevância:

30.00%

Publicador:

Resumo:

The agricultural use of swine manure is a common practice in Brazil. Its physico-chemical characteristics favor its use as biofertilizer, but the presence of pathogens may pose a risk to human health. This research presents a qualitative study of the main alternatives for pig manure disinfection, analyzing the efficiency, advantages and limitations of each procedure. The disinfection studies reported in the literature are based on the following treatments: alkaline, thermal, biological, chemical and physical. The greatest efficiencies are obtained with thermal treatment (> 4 log at 60 °C), chemical treatment (3 to 4 log with 30 mg Cl- L-1; 3 to 4 log with 40 mg O3 L-1) and physical treatment (3 to 4 log with 220 mJ cm-2 of UV radiation). Biological treatment (anaerobiosis) also reduces the pathogen load of swine manure, although with lower efficiency (1 to 2 log). The selection of a treatment should consider: implementation and operation costs, the need for preliminary treatment, the efficiency obtained, and the destination of the treated manure (agricultural use, water reuse). Brazilian regulation has no specific guidelines for the microbiological quality of animal production effluents, an important gap given the transformation of confined animal feeding operations in the country in recent years.
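The efficiencies above are expressed as decimal log reductions: an n-log treatment leaves a 10^-n fraction of the initial pathogen load. A quick illustration (the pathogen counts are hypothetical, chosen only to show the arithmetic):

```python
import math

def log_reduction(n0: float, n: float) -> float:
    """Decimal log reduction between initial and final pathogen counts."""
    return math.log10(n0 / n)

# Hypothetical counts: a thermal treatment achieving >4-log efficiency
n0, n = 1.0e6, 50.0          # CFU/mL before and after treatment
lr = log_reduction(n0, n)
surviving = 10 ** (-lr)      # surviving fraction of the initial load
print(f"{lr:.1f}-log reduction, surviving fraction {surviving:.0e}")
```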

Relevância:

30.00%

Publicador:

Resumo:

This research focuses on the possibility of applying non-ferrous metals in boiler pressure parts as a substitute for the currently used ferrous-base alloys. The main issue was to determine the resistance of some prospective non-ferrous metals to chlorine-induced corrosion. The experimental study was performed using simultaneous thermal analysis (STA) in the temperature range of 400-700 °C. Chloride-induced corrosion was simulated by exposing mixtures of metal samples with potassium chloride to synthetic air. The data obtained from the thermal balance tests show the advantage of the synergetic effect of non-ferrous alloys compared to single metals.

Relevância:

30.00%

Publicador:

Resumo:

Information gained from the Human Genome Project and improvements in compound synthesis have increased the number of both therapeutic targets and potential lead compounds. This has created a need for better screening techniques with the capacity to screen large compound libraries against an increasing number of targets. Radioactivity-based assays have traditionally been used in drug screening, but fluorescence-based assays have become more popular in high-throughput screening (HTS) because they avoid the safety and waste problems associated with radioactivity. Compared with conventional fluorescence, more sensitive detection is obtained with time-resolved luminescence, which has increased the popularity of time-resolved fluorescence resonance energy transfer (TR-FRET) based assays. To simplify the current TR-FRET based assay concept, a luminometric homogeneous single-label assay technique, Quenching Resonance Energy Transfer (QRET), was developed. The technique uses a soluble quencher to non-specifically quench the signal of the unbound fraction of the lanthanide-labeled ligand. A single labeling procedure and fewer manipulation steps in the assay concept save resources. The QRET technique is suitable for both biochemical and cell-based assays, as demonstrated in four studies: 1) a ligand screening study of the β2-adrenergic receptor (cell-based), 2) an activation study of Gs-/Gi-protein coupled receptors measuring the intracellular concentration of cyclic adenosine monophosphate (cell-based), 3) an activation study of G-protein coupled receptors observing the binding of guanosine-5'-triphosphate (cell membranes), and 4) an activation study of the small GTP-binding protein Ras (biochemical). Signal-to-background ratios were between 2.4 and 10, and coefficients of variation varied from 0.5 to 17%, indicating suitability for HTS use.
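The signal-to-background ratio and coefficient of variation quoted above are the standard figures of merit for such assays and are computed from replicate wells. A minimal sketch (the replicate luminescence readings are invented for illustration, not data from the four studies):

```python
import numpy as np

# Hypothetical replicate luminescence readings from a microtiter plate
signal = np.array([9800.0, 10150.0, 9950.0, 10100.0])   # specific binding wells
background = np.array([1020.0, 980.0, 1000.0, 1000.0])  # quenched / unbound wells

# Signal-to-background: mean specific signal over mean background
sb_ratio = signal.mean() / background.mean()

# Coefficient of variation of the signal wells, as a percentage
# (ddof=1 gives the sample standard deviation)
cv_signal = 100.0 * signal.std(ddof=1) / signal.mean()

print(f"S/B = {sb_ratio:.1f}, CV = {cv_signal:.1f}%")
```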