912 results for fundamental principles and applications
Abstract:
Since its discovery, radioactivity has brought numerous benefits to human societies. It has many applications in medicine, serving as a tool for non-invasive diagnostic methods and for therapies against diseases such as cancer. It is also applied in energy technologies at nuclear power plants, with relatively low impacts provided that strict safety is maintained. All applications, however, carry risks, and processes and operations involving radioactive elements must be conducted with maximum caution because, once released into the environment, these elements have extremely harmful effects on the organisms affected. This paper presents fundamental concepts and principles of nuclear physics needed to understand the effects of radioactive elements released into the environment, culminating in the issue of radioactive contamination. A literature review allowed us to understand the problem of radioactive contamination of living beings. Three major nuclear accidents have happened in the last thirty years, two of them in consecutive years. The nuclear accident at Chernobyl, Ukraine, in 1986 contaminated large areas, condemning hundreds of thousands of people to live with the consequences of the accident and the effects of radiation, and killing thousands of people over the years. In 1987, a major radiological accident occurred in Goiânia (GO) when a radioactive cesium source was breached, leading to the deaths of those who had direct or indirect contact with the cesium. The most recent accident, in March 2011, occurred at the nuclear power plant in Fukushima Prefecture, Japan, after an earthquake and tsunami hit the region. There is still no extensive and accurate knowledge of the consequences of the contamination caused by that accident, although signs can already be verified on a global scale. An analysis of reports of large-area contamination caused by nuclear plants through the release of hazardous wastes suggests that the energy matrix of the various countries needs to be rethought...
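As a brief quantitative aside on the nuclear-physics principles the abstract refers to (a standard textbook relation added here for illustration, not material from the paper itself), the persistence of a released radionuclide is governed by the exponential decay law; for the cesium-137 involved in the Goiânia accident, the half-life is roughly 30 years.

```latex
% Exponential decay law and decay constant (standard form, for illustration only).
N(t) = N_0 \, e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{T_{1/2}}
% With T_{1/2} \approx 30\ \text{years for}\ ^{137}\mathrm{Cs}: roughly half of a
% released quantity remains after 30 years, about a quarter after 60 years.
```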
Abstract:
The respiration of metal oxides by the bacterium Geobacter sulfurreducens requires the assembly of a small peptide (the GS pilin) into conductive filaments termed pili. We gained insights into the contribution of the GS pilin to the pilus conductivity by developing a homology model and performing molecular dynamics simulations of the pilin peptide in vacuo and in solution. The results were consistent with a predominantly helical peptide containing the conserved α-helix region required for pilin assembly but carrying a short carboxy-terminal random-coil segment rather than the large globular head of other bacterial pilins. The electronic structure of the pilin was also explored from first principles and revealed a biphasic charge distribution along the pilin and a low electronic HOMO-LUMO gap, even in a wet environment. The low electronic band gap was the result of strong electrostatic fields generated by the alignment of the peptide-bond dipoles in the pilin's α-helix and by charges from ions in solution and amino acids in the protein. The electronic structure also revealed some level of orbital delocalization in regions of the pilin containing aromatic amino acids and in spatial regions of high resonance where the HOMO and LUMO states are located, which could provide an optimal environment for the hopping of electrons under thermal fluctuations. Hence, the structural and electronic features of the pilin revealed in these studies support the notion of a pilin peptide environment optimized for electron conduction.
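For readers unfamiliar with how a HOMO-LUMO gap is read off a first-principles calculation, the following is a minimal sketch using the open-source PySCF package on a small stand-in molecule (water). It is not the authors' workflow and says nothing about the pilin itself; it only shows where the frontier-orbital energies come from in a self-consistent-field calculation.

```python
# Minimal illustration (not the authors' setup): extract the HOMO-LUMO gap
# from a restricted Hartree-Fock calculation on a small stand-in molecule.
from pyscf import gto, scf

# Water as a toy system; a solvated pilin model would require thousands of
# atoms and a far more elaborate electronic-structure treatment.
mol = gto.M(atom="O 0 0 0; H 0 0 0.96; H 0.93 0 -0.24", basis="sto-3g")
mf = scf.RHF(mol).run()

occupied = mf.mo_energy[mf.mo_occ > 0]   # energies of occupied orbitals
virtual = mf.mo_energy[mf.mo_occ == 0]   # energies of unoccupied orbitals
gap_hartree = virtual[0] - occupied[-1]  # E_LUMO - E_HOMO
print(f"HOMO-LUMO gap: {gap_hartree * 27.2114:.2f} eV")
```

In the studies summarized above the gap of course comes from much larger, solvated pilin models, but the frontier-orbital bookkeeping is the same.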
Abstract:
Turbulence is one of the key problems of classical physics, and it has been the object of intense research over the last decades in a large spectrum of problems involving fluids, plasmas, and waves. In order to review some advances in theoretical and experimental investigations on turbulence, a mini-symposium on this subject was organized at the Dynamics Days South America 2010 Conference. The main goal of this mini-symposium was to present recent developments in both fundamental aspects and dynamical analysis of turbulence in nonlinear waves and fusion plasmas. In this paper we present a summary of the works presented at this mini-symposium. Among the questions addressed were the onset and control of turbulence and spatio-temporal chaos. (C) 2011 Elsevier B. V. All rights reserved.
Abstract:
We study isoparametric submanifolds of rank at least two in a separable Hilbert space, which are known to be homogeneous by the main result in [E. Heintze and X. Liu, Ann. of Math. (2), 149 (1999), 149-181], and with such a submanifold M and a point x in M we associate a canonical homogeneous structure Γ_x (a certain bilinear map defined on a subspace of T_xM × T_xM). We prove that Γ_x, together with the second fundamental form α_x, encodes all the information about M, and we deduce from this the rigidity result that M is completely determined by α_x and (∇α)_x, thereby making such submanifolds accessible to classification. As an essential step, we show that the one-parameter groups of isometries constructed in [E. Heintze and X. Liu, Ann. of Math. (2), 149 (1999), 149-181] to prove their homogeneity induce smooth and hence everywhere defined Killing fields, implying the continuity of Γ (this result also seems to close a gap in [U. Christ, J. Differential Geom., 62 (2002), 1-15]). An important tool here is the introduction of affine root systems of isoparametric submanifolds.
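For orientation on the notation (a standard definition added here, not part of the original abstract): for a submanifold M of a Hilbert space, the second fundamental form at x is the symmetric bilinear map

```latex
% Standard definition, included only as background for the abstract's notation.
\alpha_x \colon T_xM \times T_xM \to \nu_xM, \qquad
\alpha_x(X,Y) = \bigl(\bar{\nabla}_X Y\bigr)^{\perp},
```

where \bar{\nabla} is the flat connection of the ambient space, ν_xM the normal space at x, and ⊥ the orthogonal projection onto it; the canonical homogeneous structure Γ_x of the abstract is likewise a bilinear map on a subspace of T_xM × T_xM.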
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing, at runtime, typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a central point of the scientific activity. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models thus becomes fundamental for comparing and evaluating methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define a methodology. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated, and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; it is clear, however, that at least all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some function) and topology abstractions (entities of the environment that represent its spatial structure, either logical or physical). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for supporting the management of the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which can be used by designers to provide different levels of abstraction over multi-agent systems.
The research in these fields has led to the formulation of a new version of the SODA methodology in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
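To make the two "ingredients" of the environment and the layering principle more concrete, here is a minimal, hypothetical sketch in Python; the class and method names are mine, chosen only for illustration, and do not reproduce SODA's actual abstractions or notation.

```python
# Hypothetical illustration of environment abstractions, topology abstractions
# and a layered description; all names are invented for this sketch, not SODA's own.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class EnvironmentAbstraction:
    """A non-agent entity of the system encapsulating some function."""
    name: str

    def provide(self, request: str) -> str:
        return f"{self.name} handles '{request}'"


@dataclass
class TopologyAbstraction:
    """Represents the (logical or physical) spatial structure of the environment."""
    places: list[str] = field(default_factory=list)

    def neighbours(self, place: str) -> list[str]:
        # Toy adjacency: every other place is considered reachable.
        return [p for p in self.places if p != place]


@dataclass
class Layer:
    """One level of abstraction in a multi-layered description of the system."""
    name: str
    environment: list[EnvironmentAbstraction] = field(default_factory=list)
    topology: TopologyAbstraction | None = None


# A designer could zoom between a coarse and a more detailed description:
coarse = Layer("warehouse", [EnvironmentAbstraction("stock-service")])
detailed = Layer(
    "aisle-level",
    [EnvironmentAbstraction("shelf-sensor"), EnvironmentAbstraction("conveyor")],
    TopologyAbstraction(places=["aisle-1", "aisle-2", "dock"]),
)
print(detailed.topology.neighbours("aisle-1"))   # -> ['aisle-2', 'dock']
```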
Abstract:
Bread dough, and particularly wheat dough, due to its viscoelastic behaviour, is probably the most dynamic and complicated rheological system, and its characteristics are very important since they strongly affect the textural and sensorial properties of the final products. The study of dough rheology has been a very challenging task for many researchers since it can provide a wealth of information about dough formulation, structure and processing. This explains why dough rheology has been a matter of investigation for several decades. In this research, rheological assessment of doughs and breads was performed using empirical and fundamental methods at both small and large deformation, in order to characterize different types of doughs and final products such as bread. In order to study the structural aspects of food products, image analysis techniques were used to integrate the information coming from the empirical and fundamental rheological measurements. Evaluation of dough properties was carried out by texture profile analysis (TPA), dough stickiness (Chen and Hoseney cell) and uniaxial extensibility determination (Kieffer test) using a Texture Analyser; small deformation rheological measurements were performed on a controlled stress-strain rheometer; moreover, the structure of the different doughs was observed by image analysis, while bread characteristics were studied by texture profile analysis (TPA) and image analysis. The objective of this research was to understand whether the different rheological measurements were able to characterize and differentiate the samples analysed, in order to investigate the effect of different formulation and processing conditions on dough and final product from a structural point of view. To this aim, the following materials were prepared and analysed: frozen dough made without yeast; frozen dough and bread made with frozen dough; doughs obtained using different fermentation methods; doughs made with Kamut® flour; dough and bread prepared with the addition of ginger powder; and final products coming from different bakeries. The influence of sub-zero storage time on the viscoelastic performance of non-fermented and fermented dough and on the final product (bread) was evaluated using small and large deformation methods. In general, the longer the sub-zero storage time, the lower the positive viscoelastic attributes. The effect of fermentation time and of different types of fermentation (straight-dough method, sponge-and-dough procedure and poolish method) on the rheological properties of doughs was investigated using empirical and fundamental analysis, and image analysis was used to integrate this information through the evaluation of the dough's structure. The results of the fundamental rheological tests showed that the incorporation of sourdough (poolish method) provoked changes that were different from those seen in the other types of fermentation. The positive effect of some ingredients (extra-virgin olive oil and a liposomic lecithin emulsifier) in improving the rheological characteristics of Kamut® dough was confirmed also when the dough was subjected to low temperatures (24 hours and 48 hours at 4°C).
Small deformation oscillatory measurements and large deformation mechanical tests provided useful information on the rheological properties of samples prepared with different amounts of ginger powder, showing that the sample with the highest amount of ginger powder (6%) had worse rheological characteristics than the other samples. Moisture content, specific volume, texture and crumb grain characteristics are the major quality attributes of bread products. The different samples analysed, "Coppia Ferrarese", "Pane Comune Romagnolo" and "Filone Terra di San Marino", showed a decrease in crumb moisture and an increase in hardness over the storage time. Parameters such as cohesiveness and springiness, evaluated by TPA and indicators of fresh bread quality, decreased during storage. Using empirical rheological tests, we found several differences among the samples, due to the different ingredients used in the formulations and the different processes adopted to prepare the samples; but since these products are handmade, such differences can be regarded as an added value. In conclusion, small deformation (in fundamental units) and large deformation methods played a significant role in monitoring the influence of the different ingredients, processing conditions and storage conditions on dough viscoelastic performance and on the final product. Finally, knowledge of the formulation, processing and storage conditions, together with the evaluation of structural and rheological characteristics, is fundamental for the study of complex matrices like bakery products, where numerous variables can influence the final quality (e.g. raw materials, bread-making procedure, time and temperature of fermentation and baking).
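As background for the "small deformation oscillatory measurements" mentioned above (a standard textbook relation, not a result of this thesis), the viscoelastic response of dough in small-amplitude oscillatory shear is summarized by the complex modulus:

```latex
% Standard definition of the complex modulus in small-amplitude oscillatory shear.
G^{*}(\omega) = G'(\omega) + i\,G''(\omega), \qquad
\tan\delta = \frac{G''(\omega)}{G'(\omega)},
```

where G' is the storage (elastic) modulus, G'' the loss (viscous) modulus, and tan δ their ratio; a lower tan δ indicates a more elastic, solid-like dough.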
Abstract:
Mainstream hardware is becoming parallel, heterogeneous, and distributed on every desk, in every home and in every pocket. As a consequence, in recent years software has been undergoing an epochal turn toward concurrency, distribution and interaction, pushed by the evolution of hardware architectures and the growing availability of networks. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers have to face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. Then, we shift the perspective from the development of intelligent software systems toward general-purpose software development. Using the expertise gained during the background phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and at the same time provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
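As a minimal point of reference for the actor paradigm the thesis takes as its starting point (an illustrative sketch only; it shows a plain actor with a mailbox in Python, not simpAL or its agent-oriented abstractions):

```python
# Minimal actor: an isolated unit of state that reacts to messages from its
# mailbox, one at a time. Illustrative only; simpAL's agent-oriented
# abstractions go well beyond this model.
import queue
import threading


class CounterActor:
    def __init__(self):
        self.mailbox = queue.Queue()
        self.count = 0
        threading.Thread(target=self._loop, daemon=True).start()

    def send(self, message):
        self.mailbox.put(message)

    def _loop(self):
        while True:
            message = self.mailbox.get()
            if message == "inc":
                self.count += 1                    # state touched only by this thread
            elif isinstance(message, tuple) and message[0] == "report":
                message[1].put(self.count)         # reply over a channel, no shared state


actor = CounterActor()
for _ in range(3):
    actor.send("inc")
reply = queue.Queue()
actor.send(("report", reply))
print(reply.get())   # -> 3
```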
Abstract:
Efficient coupling of light to quantum emitters, such as atoms, molecules or quantum dots, is one of the great challenges in current research. The interaction can be strongly enhanced by coupling the emitter to the evanescent field of subwavelength dielectric waveguides that offer strong lateral confinement of the guided light. In this context, subwavelength-diameter optical nanofibers as part of a tapered optical fiber (TOF) have proven to be a powerful tool that also provides an efficient transfer of the light from the interaction region to an optical bus, that is to say, from the nanofiber to an optical fiber.
Another approach toward enhancing light-matter interaction is to employ an optical resonator in which the light circulates and thus passes the emitters many times. Here, both approaches are combined by experimentally realizing a microresonator with an integrated nanofiber waist. This is achieved by building a fiber-integrated Fabry-Pérot type resonator from two fiber Bragg grating mirrors with a stop-band near the cesium D2-line wavelength. The characteristics of this resonator fulfill the requirements of nonlinear optics, optical sensing, and cavity quantum electrodynamics in the strong-coupling regime. Together with its advantageous features, such as a constant high coupling strength over a large volume, tunability, high transmission outside the mirror stop-band, and a monolithic design, this resonator is a promising tool for experiments with nanofiber-coupled atomic ensembles in the strong-coupling regime.
The resonator's high sensitivity to the optical properties of the nanofiber provides a probe for changes of physical parameters that affect the guided optical mode, e.g., the temperature via the thermo-optic effect of silica. Utilizing this detection scheme, the thermalization dynamics due to far-field heat radiation of a nanofiber is studied over a large temperature range. This investigation provides, for the first time, a measurement of the total radiated power of an object with a diameter smaller than all absorption lengths in the thermal spectrum, at the level of a single object of deterministic shape and material. The results show excellent agreement with an ab initio thermodynamic model that considers heat radiation as a volumetric effect and that takes the emitter's shape and size relative to the emission wavelength into account. Modeling and investigating the thermalization of microscopic objects of arbitrary shape from first principles is of fundamental interest and has important applications, such as heat management in nano-devices or the radiative forcing of aerosols in Earth's climate system.
Using a similar method, the effect of the TOF's mechanical modes on the polarization and phase of the fiber-guided light is studied. The measurement results show that in typical TOFs these quantities exhibit high-frequency thermal fluctuations. They originate from high-Q torsional oscillations that couple to the nanofiber-guided light via the strain-optic effect. An ab initio opto-mechanical model of the TOF is developed that provides an accurate quantitative prediction for the mode spectrum and the mechanically induced polarization and phase fluctuations. These high-frequency fluctuations may limit the ultimate ideality of fiber coupling into photonic structures. Furthermore, first estimations show that they may currently limit the storage time of nanofiber-based atom traps. The model, on the other hand, provides a method to design TOFs with tailored mechanical properties in order to meet experimental requirements.
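For orientation on the fiber-integrated Fabry-Pérot resonator described above (standard relations for a two-mirror cavity, added for context rather than taken from this thesis), the free spectral range, finesse and linewidth are approximately:

```latex
% Standard Fabry-Pérot relations for a cavity of optical length nL with two
% mirrors of equal power reflectivity R (high-reflectivity approximation).
\nu_{\mathrm{FSR}} = \frac{c}{2 n L}, \qquad
\mathcal{F} \approx \frac{\pi \sqrt{R}}{1 - R}, \qquad
\delta\nu = \frac{\nu_{\mathrm{FSR}}}{\mathcal{F}},
```

where δν is the cavity linewidth; reaching the strong-coupling regime requires the atom-cavity coupling rate to exceed both the cavity field decay rate and the atomic dipole decay rate.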
Abstract:
Over the last decade, translational science has come into the focus of academic medicine, and significant intellectual and financial efforts have been made to initiate a multitude of bench-to-bedside projects. The quest for suitable biomarkers that will significantly change clinical practice has become one of the biggest challenges in translational medicine. Quantitative measurement of proteins is a critical step in biomarker discovery. Assessing a large number of potential protein biomarkers in a statistically significant number of samples and controls still constitutes a major technical hurdle. Multiplexed analysis offers significant advantages regarding time, reagent cost, sample requirements and the amount of data that can be generated. The two contemporary approaches in multiplexed and quantitative biomarker validation, antibody-based immunoassays and MS-based multiple (or selected) reaction monitoring, are based on different assay principles and instrument requirements. Both approaches have their own advantages and disadvantages and therefore have complementary roles in the multi-staged biomarker verification and validation process. In this review, we discuss quantitative immunoassay and multiple reaction monitoring/selected reaction monitoring assay principles and development. We also discuss choosing an appropriate platform, judging the performance of assays, obtaining reliable, quantitative results for translational research and clinical applications in the biomarker field.
Abstract:
During the past two decades, chiral capillary electrophoresis (CE) emerged as a promising, effective and economic approach for the enantioselective determination of drugs and their metabolites in body fluids, tissues and in vitro preparations. This review discusses the principles and important aspects of CE-based chiral bioassays, provides a survey of the assays developed during the past 10 years and presents an overview of the key achievements encountered in that time period. Applications discussed encompass the pharmacokinetics of drug enantiomers in vivo and in vitro, the elucidation of the stereoselectivity of drug metabolism in vivo and in vitro, and bioanalysis of drug enantiomers of toxicological, forensic and doping interest. Chiral CE was extensively employed for research purposes to investigate the stereoselectivity associated with hydroxylation, dealkylation, carboxylation, sulfoxidation, N-oxidation and ketoreduction of drugs and metabolites. Enantioselective CE played a pivotal role in many biomedical studies, thereby providing new insights into the stereoselective metabolism of drugs in different species which might eventually lead to new strategies for optimization of pharmacotherapy in clinical practice.
Abstract:
This paper constitutes a summary of the consensus documents agreed at the First European Workshop on Implant Dentistry University Education held in Prague on 19-22 June 2008. Implant dentistry is becoming an increasingly important treatment alternative for the restoration of missing teeth as patients' expectations and demands increase. Furthermore, implant-related complications such as peri-implantitis are presenting more frequently in the dental surgery. This consensus paper recommends that implant dentistry should be an integral part of the undergraduate curriculum. Whilst few schools will achieve student competence in the surgical placement of implants, this should not preclude the inclusion in the undergraduate curriculum of the fundamental principles of implant dentistry, such as the evidence base for their use, indications and contraindications, and the treatment of complications that may arise. The consensus paper sets out the rationale for the introduction of implant dentistry into the dental curriculum and the knowledge base for an undergraduate programme in the subject. It lists the competencies that might be sought, without expectations of surgical placement of implants at this stage, and the assessment methods that might be employed. The paper also addresses the competencies and educational pathways for postgraduate education in implant dentistry.
Abstract:
Telescopic systems of structural members with clearance are found in many applications, e.g., mobile cranes, rack feeders, fork lifts and stacker cranes (see Figure 1). When these machines are operated, undesirable vibrations may reduce performance and raise safety concerns. This contribution therefore aims to reduce these harmful vibrations. For a better understanding, the dynamic behaviour of these constructions is analysed. The main interest lies in the overlap region of each pair of adjacent sections of the systems described above (see markings in Figure 1), which is investigated by measurements and by computations. A test rig is constructed to determine the dynamic behaviour by measuring fundamental vibrations and higher-frequency oscillations, damping coefficients, particular phenomena and more. For an appropriate physical model, the governing boundary value problem is derived by applying Hamilton's principle, and a classical discretisation procedure is used to generate a coupled system of nonlinear ordinary differential equations as the corresponding truncated mathematical model. On the basis of this model, a controller concept for preventing harmful vibrations is developed.
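To indicate the modelling route mentioned above in symbols (generic textbook forms only; the specific energies, shape functions and matrices of this contribution are not reproduced here), Hamilton's principle and a Ritz/Galerkin-type discretisation lead from the boundary value problem to a truncated ODE model:

```latex
% Extended Hamilton principle (T: kinetic energy, V: potential energy,
% \delta W_{nc}: virtual work of non-conservative forces).
\delta \int_{t_1}^{t_2} (T - V)\, \mathrm{d}t
  + \int_{t_1}^{t_2} \delta W_{nc}\, \mathrm{d}t = 0 .

% Ritz/Galerkin ansatz with admissible shape functions \varphi_k and the
% resulting truncated system of nonlinear ODEs (illustrative generic form).
w(x,t) \approx \sum_{k=1}^{N} \varphi_k(x)\, q_k(t)
\;\;\Longrightarrow\;\;
\mathbf{M}\ddot{\mathbf{q}} + \mathbf{D}\dot{\mathbf{q}} + \mathbf{K}\mathbf{q}
  + \mathbf{f}_{\mathrm{nl}}(\mathbf{q},\dot{\mathbf{q}}) = \mathbf{b}(t) .
```

Such a truncated model is the natural basis for the controller concept mentioned in the abstract, since standard state-space control design applies once the generalized coordinates q_k are chosen.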