Abstract:
The great challenges for researchers working in the field of vaccinology are optimizing DNA vaccines for use in humans or large animals and creating effective single-dose vaccines using appropriate controlled-delivery systems. Plasmid DNA encoding the heat-shock protein 65 (hsp65) (DNAhsp65) has been shown to induce protective and therapeutic immune responses in a murine model of tuberculosis (TB). Despite the success of the naked DNAhsp65-based vaccine in protecting mice against TB, it requires multiple doses of high amounts of DNA for effective immunization. In order to optimize this DNA vaccine and simplify the vaccination schedule, we coencapsulated DNAhsp65 and the adjuvant trehalose dimycolate (TDM) into biodegradable poly(DL-lactide-co-glycolide) (PLGA) microspheres for single-dose administration. Moreover, a single-shot prime-boost vaccine formulation based on a mixture of two different PLGA microspheres, presenting faster release of DNAhsp65 and slower release of the recombinant hsp65 protein, was also developed. These formulations were tested in mice as well as in guinea pigs, comparing their efficacy and toxicity with those induced by the naked DNA preparation or BCG. The single-shot prime-boost formulation clearly presented good efficacy and diminished lung pathology in both mice and guinea pigs.
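The prime-boost idea of mixing fast- and slow-releasing microsphere populations can be illustrated with a toy first-order release model (the rate constants below are illustrative assumptions, not measured PLGA release data):

```python
import math

# Toy first-order release curves for the two microsphere populations:
# fast release of DNAhsp65 (prime) and slow release of hsp65 protein
# (boost). Rate constants are illustrative assumptions only.
def cumulative_release(k, t):
    # fraction of the payload released by time t (first-order kinetics)
    return 1.0 - math.exp(-k * t)

k_dna, k_protein = 0.30, 0.03   # per day: fast vs slow population
dna_week1 = cumulative_release(k_dna, 7)         # mostly released early
protein_week1 = cumulative_release(k_protein, 7)  # still mostly retained
```

The separation of the two curves is what lets a single injection deliver the priming antigen early and the boosting antigen later.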
Abstract:
Abstract Background: A decline in immune and endocrine function occurs with aging. The main purpose of this study was to investigate the impact of long-term endurance training on the immune and endocrine systems of elderly men. The possible interaction between these systems was also analysed. Results: Elderly runners showed a significantly higher T cell proliferative response and IL-2 production than sedentary elderly controls. IL-2 production was similar to that in young adults. Their serum IL-6 levels were significantly lower than those of their sedentary peers. They also showed significantly lower IL-3 production than sedentary elderly subjects, but similar to that of the young adults. Anabolic hormone levels did not differ between the elderly groups, and no clear correlation was found between hormone and cytokine levels. Conclusion: Highly conditioned elderly men seem to have a relatively better-preserved immune system than sedentary elderly men. Long-term endurance training has the potential to decelerate the age-related decline in immune function, but not the deterioration in endocrine function.
Abstract:
Abstract Background American cutaneous leishmaniasis (ACL) is a re-emerging disease in the state of São Paulo, Brazil. It is important to understand both the vector and disease distributions to help design control strategies. As an initial step in applying geographic information systems (GIS) and remote sensing (RS) tools to map disease risk, the objectives of the present work were to: (i) produce a single database of species distributions of the sand fly vectors in the state of São Paulo, (ii) create combined distributional maps of both the incidence of ACL and its sand fly vectors, and (iii) thereby provide individual municipalities with a source of reference material for work carried out in their area. Results A database containing 910 individual records of sand fly occurrence in the state of São Paulo, from 37 different sources, was compiled. These records date from 1943 to 2009, and describe the presence of at least one of the six incriminated or suspected sand fly vector species in 183/645 (28.4%) municipalities. For the remaining 462 (71.6%) municipalities, we were unable to locate records of any of the six incriminated or suspected sand fly vector species (Nyssomyia intermedia, N. neivai, N. whitmani, Pintomyia fischeri, P. pessoai and Migonemyia migonei). The distributions of each of the six incriminated or suspected vector species of ACL in the state of São Paulo were individually mapped and overlaid on the incidence of ACL for the periods 1993 to 1995 and 1998 to 2007. Overall, the maps reveal that the six sand fly vector species analyzed have unique and heterogeneous, although often overlapping, distributions. Two sand fly species, Nyssomyia intermedia and N. neivai, are highly localized, while the other four, N. whitmani, M. migonei, P. fischeri and P. pessoai, are much more broadly distributed.
ACL has been reported in 160/183 (87.4%) of the municipalities with records for at least one of the six incriminated or suspected sand fly vector species, while there are no records of any of these sand fly species in 318/478 (66.5%) municipalities with ACL. Conclusions The maps produced in this work provide basic data on the distribution of the six incriminated or suspected sand fly vectors of ACL in the state of São Paulo, and highlight the complex and geographically heterogeneous pattern of ACL transmission in the region. Further studies are required to clarify the role of each of the six suspected sand fly vector species in different regions of the state of São Paulo, especially in the majority of municipalities where ACL is present but sand fly vectors have not yet been identified.
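The overlay analysis described above can be sketched as a small aggregation of occurrence records into per-municipality presence sets (the municipality names and records below are hypothetical, chosen only to illustrate the data structure):

```python
from collections import defaultdict

# Aggregate occurrence records into a per-municipality presence map and
# flag municipalities reporting ACL without any recorded vector species.
records = [
    ("Sorocaba", "Nyssomyia intermedia"),
    ("Sorocaba", "Pintomyia fischeri"),
    ("Registro", "Nyssomyia neivai"),
]
acl_municipalities = {"Sorocaba", "Registro", "Itu"}

def presence_map(recs):
    by_muni = defaultdict(set)
    for municipality, species in recs:
        by_muni[municipality].add(species)
    return by_muni

vectors = presence_map(records)
# ACL municipalities where no vector has been recorded (the open
# question highlighted in the conclusions)
unexplained = sorted(m for m in acl_municipalities if m not in vectors)
```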
Abstract:
Abstract Background Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we were able to join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) process correctness, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using simple end-user interfaces.
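The combination of workflow definitions with algebraic control-flow checks can be illustrated by a tiny ACP-flavoured process term whose execution traces are enumerated and validated (a sketch under assumed names, not the CEGH implementation):

```python
def traces(proc):
    # Enumerate the execution traces of a small process term, in the
    # spirit of ACP: ("seq", p, q) is sequential composition,
    # ("alt", p, q) is choice, ("par", p, q) is interleaving;
    # a plain string is an atomic testing step.
    if isinstance(proc, str):
        return [[proc]]
    op, p, q = proc
    if op == "seq":
        return [t1 + t2 for t1 in traces(p) for t2 in traces(q)]
    if op == "alt":
        return traces(p) + traces(q)
    if op == "par":
        out = []
        for t1 in traces(p):
            for t2 in traces(q):
                out.extend(interleavings(t1, t2))
        return out
    raise ValueError("unknown operator: %s" % op)

def interleavings(a, b):
    if not a:
        return [b]
    if not b:
        return [a]
    return ([[a[0]] + t for t in interleavings(a[1:], b)] +
            [[b[0]] + t for t in interleavings(a, b[1:])])

# Hypothetical test workflow: DNA extraction, then PCR and sequencing
# in parallel, then report validation.
workflow = ("seq", "extract_dna",
            ("seq", ("par", "pcr", "sequencing"), "validate_report"))

# Control-flow correctness check: every possible execution must end
# with report validation.
ok = all(t[-1] == "validate_report" for t in traces(workflow))
```

Exhaustive trace enumeration only scales to small terms; the appeal of process algebra is precisely that such properties can also be proven symbolically.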
Abstract:
The wide use of e-technologies represents a great opportunity for underserved segments of the population, especially with the aim of reintegrating excluded individuals back into society through education. This is particularly true for people with different types of disabilities who may have difficulties attending traditional on-site learning programs, which are typically based on printed learning resources. The creation and provision of accessible e-learning contents may therefore become a key factor in enabling people with different access needs to enjoy quality learning experiences and services. Another e-learning challenge is represented by m-learning (mobile learning), which is emerging as a consequence of the diffusion of mobile terminals and provides the opportunity to browse didactical materials everywhere, outside places that are traditionally devoted to education. Both situations share the need to access materials under constrained conditions, and collide with the growing use of rich media in didactical contents, which are designed to be enjoyed without any restriction. Nowadays, Web-based teaching makes great use of multimedia technologies, ranging from Flash animations to prerecorded video-lectures. Rich media in e-learning can offer significant potential for enhancing the learning environment, by helping to increase access to education, enhance the learning experience and support multiple learning styles. Moreover, they can often be used to improve the structure of Web-based courses. These highly variegated and structured contents may significantly improve the quality and the effectiveness of educational activities for learners. For example, rich media contents allow us to describe complex concepts and process flows. Audio and video elements may be utilized to add a “human touch” to distance-learning courses. Finally, real lectures may be recorded and distributed to integrate or enrich online materials.
A confirmation of the advantages of these approaches can be seen in the exponential growth of video-lecture availability on the net, due to the ease of recording and delivering activities which take place in a traditional classroom. Furthermore, the wide use of assistive technologies for learners with disabilities injects new life into e-learning systems. E-learning allows distance and flexible educational activities, thus helping disabled learners to access resources which would otherwise present significant barriers for them. For instance, students with visual impairments have difficulties in reading traditional visual materials, deaf learners have trouble in following traditional (spoken) lectures, and people with motion disabilities have problems in attending on-site programs. As already mentioned, the use of wireless technologies and pervasive computing may really enhance the educational learner experience by offering mobile e-learning services that can be accessed by handheld devices. This new paradigm of educational content distribution maximizes the benefits for learners, since it enables users to overcome constraints imposed by the surrounding environment. While certainly helpful for users without disabilities, we believe that the use of new mobile technologies may also become a fundamental tool for impaired learners, since it frees them from sitting in front of a PC. In this way, educational activities can be enjoyed by all users, without hindrance, thus increasing the social inclusion of non-typical learners. While the provision of fully accessible and portable video-lectures may be extremely useful for students, it is widely recognized that structuring and managing rich media contents for mobile learning services are complex and expensive tasks. Indeed, major difficulties originate from the basic need to provide a textual equivalent for each media resource composing a rich media Learning Object (LO).
Moreover, tests need to be carried out to establish whether a given LO is fully accessible to all kinds of learners. Unfortunately, both these tasks are truly time-consuming processes, depending on the type of contents the teacher is writing and on the authoring tool he/she is using. Due to these difficulties, online LOs are often distributed as partially accessible or totally inaccessible content. Bearing this in mind, this thesis aims to discuss the key issues of a system we have developed to deliver accessible, customized or nomadic learning experiences to learners with different access needs and skills. To reduce the risk of excluding users with particular access capabilities, our system exploits Learning Objects (LOs) which are dynamically adapted and transcoded based on the specific needs of non-typical users and on the barriers that they can encounter in the environment. The basic idea is to dynamically adapt contents, by selecting them from a set of media resources packaged in SCORM-compliant LOs and stored in a self-adapting format. The system schedules and orchestrates a set of transcoding processes based on specific learner needs, so as to produce a customized LO that can be fully enjoyed by any (impaired or mobile) student.
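The content-adaptation step described above can be sketched as a preference-ordered selection of media variants per learner profile (all profile names and format labels below are illustrative assumptions, not the thesis' actual SCORM tooling):

```python
# Preference-ordered selection of a media variant per resource, given a
# learner profile; a fall-back textual equivalent is always assumed to
# exist, as required for accessibility. Names are invented for
# illustration.
PREFERENCES = {
    "blind":   ["audio", "text"],
    "deaf":    ["captioned_video", "text"],
    "mobile":  ["low_res_video", "audio", "text"],
    "default": ["video", "audio", "text"],
}

def adapt_lo(resources, profile):
    """resources maps resource ids to the set of available formats."""
    order = PREFERENCES.get(profile, PREFERENCES["default"])
    return {rid: next((f for f in order if f in formats), "text")
            for rid, formats in resources.items()}

lo = {"lecture1": {"video", "audio", "text"},
      "diagram1": {"video", "text"}}
plan = adapt_lo(lo, "blind")
```

A real system would then schedule transcoding jobs to produce any preferred variant that is not yet available, rather than only choosing among existing ones.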
Abstract:
Introduction 1.1 Occurrence of polycyclic aromatic hydrocarbons (PAH) in the environment Worldwide industrial and agricultural developments have released a large number of natural and synthetic hazardous compounds into the environment due to careless waste disposal, illegal waste dumping and accidental spills. As a result, there are numerous sites in the world that require cleanup of soils and groundwater. Polycyclic aromatic hydrocarbons (PAHs) are one of the major groups of these contaminants (Da Silva et al., 2003). PAHs constitute a diverse class of organic compounds consisting of two or more aromatic rings in various structural configurations (Prabhu and Phale, 2003). As derivatives of benzene, PAHs are thermodynamically stable. In addition, these chemicals tend to adhere to particle surfaces, such as soils, because of their low water solubility and strong hydrophobicity, which results in greater persistence under natural conditions. This persistence, coupled with their potential carcinogenicity, makes PAHs problematic environmental contaminants (Cerniglia, 1992; Sutherland, 1992). PAHs are widely found in high concentrations at many industrial sites, particularly those associated with the petroleum, gas production and wood-preserving industries (Wilson and Jones, 1993). 1.2 Remediation technologies Conventional techniques used for the remediation of soil polluted with organic contaminants include excavation of the contaminated soil and disposal to a landfill, or capping (containment) of the contaminated areas of a site. These methods have some drawbacks. The first method simply moves the contamination elsewhere and may create significant risks in the excavation, handling and transport of hazardous material. Additionally, it is very difficult and increasingly expensive to find new landfill sites for the final disposal of the material.
The cap-and-containment method is only an interim solution, since the contamination remains on site, requiring monitoring and maintenance of the isolation barriers long into the future, with all the associated costs and potential liability. A better approach than these traditional methods is to completely destroy the pollutants, if possible, or transform them into harmless substances. Some technologies that have been used are high-temperature incineration and various types of chemical decomposition (for example, base-catalyzed dechlorination, UV oxidation). However, these methods have significant disadvantages, principally their technological complexity, high cost, and the lack of public acceptance. Bioremediation, in contrast, is a promising option for the complete removal and destruction of contaminants. 1.3 Bioremediation of PAH contaminated soil & groundwater Bioremediation is the use of living organisms, primarily microorganisms, to degrade or detoxify hazardous wastes into harmless substances such as carbon dioxide, water and cell biomass. Most PAHs are biodegradable under natural conditions (Da Silva et al., 2003; Meysami and Baheri, 2003), and bioremediation for cleanup of PAH wastes has been extensively studied at both laboratory and commercial levels. It has been implemented at a number of contaminated sites, including the cleanup of the Exxon Valdez oil spill in Prince William Sound, Alaska in 1989, the Mega Borg spill off the Texas coast in 1990 and the Burgan Oil Field, Kuwait in 1994 (Purwaningsih, 2002). Different strategies for PAH bioremediation, such as in situ, ex situ or on-site bioremediation, were developed in recent years. In situ bioremediation is a technique that is applied to soil and groundwater at the site without removing the contaminated soil or groundwater, based on the provision of optimum conditions for microbiological contaminant breakdown.
Ex situ bioremediation of PAHs, on the other hand, is a technique applied to soil and groundwater which has been removed from the site via excavation (soil) or pumping (water). Hazardous contaminants are converted in controlled bioreactors into harmless compounds in an efficient manner. 1.4 Bioavailability of PAH in the subsurface Frequently, PAH contamination in the environment occurs as contaminants sorbed onto soil particles rather than as a free phase (NAPL, non-aqueous phase liquid). It is known that the biodegradation rate of most PAHs sorbed onto soil is far lower than the rates measured in solution cultures of microorganisms with pure solid pollutants (Alexander and Scow, 1989; Hamaker, 1972). It is generally believed that only the fraction of PAHs dissolved in the solution can be metabolized by microorganisms in soil. The amount of contaminant that can be readily taken up and degraded by microorganisms is defined as bioavailability (Bosma et al., 1997; Maier, 2000). Two phenomena have been suggested to cause the low bioavailability of PAHs in soil (Danielsson, 2000). The first is strong adsorption of the contaminants to the soil constituents, which leads to very slow release rates of contaminants to the aqueous phase. Sorption is often well correlated with soil organic matter content (Means, 1980) and significantly reduces biodegradation (Manilal and Alexander, 1991). The second phenomenon is slow mass transfer of pollutants, such as pore diffusion in the soil aggregates or diffusion in the organic matter in the soil. The complex set of these physical, chemical and biological processes is schematically illustrated in Figure 1. As shown in Figure 1, biodegradation processes take place in the soil solution, while diffusion processes occur in the narrow pores in and between soil aggregates (Danielsson, 2000).
Seemingly contradictory studies can be found in the literature, indicating that the rate and final extent of metabolism may be either lower or higher for PAHs sorbed by soil than for pure PAHs (Van Loosdrecht et al., 1990). These contrasting results demonstrate that the bioavailability of organic contaminants sorbed onto soil is far from being well understood. Besides bioavailability, there are several other factors influencing the rate and extent of biodegradation of PAHs in soil, including microbial population characteristics, physical and chemical properties of PAHs, and environmental factors (temperature, moisture, pH, degree of contamination). Figure 1: Schematic diagram showing possible rate-limiting processes during bioremediation of hydrophobic organic contaminants in a contaminated soil-water system (not to scale) (Danielsson, 2000). 1.5 Increasing the bioavailability of PAH in soil Attempts to improve the biodegradation of PAHs in soil by increasing their bioavailability include the use of surfactants, solvents or solubility enhancers. However, the introduction of a synthetic surfactant may result in the addition of one more pollutant (Wang and Brusseau, 1993). A study conducted by Mulder et al. showed that the introduction of hydroxypropyl-β-cyclodextrin (HPCD), a well-known PAH solubility enhancer, significantly increased the solubilization of PAHs although it did not improve the biodegradation rate of PAHs (Mulder et al., 1998), indicating that further research is required in order to develop a feasible and efficient remediation method. Enhancing the extent of PAH mass transfer from the soil phase to the liquid phase might prove an efficient and environmentally low-risk way of addressing the problem of slow PAH biodegradation in soil.
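The desorption-limited picture sketched in sections 1.4 and 1.5 can be made concrete with a minimal two-compartment model in which only the dissolved fraction is degraded (rate constants are arbitrary illustrative values, not fitted data):

```python
# Minimal sketch of desorption-limited biodegradation: sorbed PAH
# desorbs to the aqueous phase at rate k_des, and only the dissolved
# fraction is degraded, at rate k_bio. Simple explicit Euler stepping.
def simulate(k_des, k_bio, sorbed0=1.0, dt=0.01, t_end=50.0):
    sorbed, aqueous, degraded = sorbed0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        desorb = k_des * sorbed * dt    # soil -> water mass transfer
        biodeg = k_bio * aqueous * dt   # microbial degradation in water
        sorbed -= desorb
        aqueous += desorb - biodeg
        degraded += biodeg
    return degraded

# With fast microbial kinetics, overall removal is limited by the
# desorption rate, which is the bioavailability argument in a nutshell:
slow_release = simulate(k_des=0.05, k_bio=5.0)
fast_release = simulate(k_des=1.0, k_bio=5.0)
```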
Abstract:
Many research fields are pushing the engineering of large-scale, mobile, and open systems towards the adoption of techniques inspired by self-organisation: pervasive computing, but also distributed artificial intelligence, multi-agent systems, social networks, peer-to-peer and grid architectures exploit adaptive techniques to make global system properties emerge in spite of the unpredictability of interactions and behaviour. Such a trend is visible also in coordination models and languages, whenever a coordination infrastructure needs to cope with managing interactions in highly dynamic and unpredictable environments. As a consequence, self-organisation can be regarded as a feasible metaphor to define a radically new conceptual coordination framework. The resulting framework defines a novel coordination paradigm, called self-organising coordination, based on the idea of spreading coordination media over the network and charging them with services that manage interactions based on local criteria, resulting in the emergence of desired and fruitful global coordination properties of the system. Features like topology, locality, time-reactiveness, and stochastic behaviour play a key role in both the definition of such a conceptual framework and the consequent development of self-organising coordination services. According to this framework, the thesis presents several self-organising coordination techniques developed during the PhD course, mainly concerning data distribution in tuple-space-based coordination systems. Some of these techniques have also been implemented in ReSpecT, a coordination language for tuple spaces, based on logic tuples and reactions to events occurring in a tuple space.
In addition, the key role played by simulation and formal verification has been investigated, analysing how automatic verification techniques like probabilistic model checking can be exploited to formally prove the emergence of desired behaviours in coordination approaches based on self-organisation. To this end, a concrete case study is presented and discussed.
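A minimal sketch of tuple-space coordination with event reactions, loosely in the spirit of ReSpecT, can clarify the mechanism (the API below is a simplification invented for illustration; ReSpecT itself is logic-based):

```python
# Toy tuple space with out/rd/in primitives and reactions fired on
# insertion; the reaction hook is where self-organising services would
# apply their local criteria.
class TupleSpace:
    def __init__(self):
        self.tuples = []
        self.reactions = []   # callbacks invoked after every out()

    def out(self, tup):
        self.tuples.append(tup)
        for react in self.reactions:
            react(self, tup)

    def rd(self, template):
        # non-destructive read; None in a template is a wildcard field
        return next((t for t in self.tuples if self._match(template, t)),
                    None)

    def in_(self, template):
        # destructive read
        t = self.rd(template)
        if t is not None:
            self.tuples.remove(t)
        return t

    @staticmethod
    def _match(template, tup):
        return len(template) == len(tup) and all(
            a is None or a == b for a, b in zip(template, tup))

ts = TupleSpace()
# reaction: record how many ordinary tuples the space holds (guarded so
# it does not react to its own bookkeeping tuples)
ts.reactions.append(
    lambda space, tup: space.out(("count", len(space.tuples)))
    if tup[0] != "count" else None)
ts.out(("temp", 21))
```

Because reactions observe only local events, composing many such spaces is how global coordination properties are meant to emerge rather than be imposed centrally.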
Abstract:
We investigate the statics and dynamics of a glassy, non-entangled, short bead-spring polymer melt with molecular dynamics simulations. Temperature ranges from slightly above the mode-coupling critical temperature to the liquid regime where features of a glassy liquid are absent. Our aim is to work out the polymer-specific effects on the relaxation and particle correlation. We find the intra-chain static structure unaffected by temperature; it depends only on the distance of monomers along the backbone. In contrast, the distinct inter-chain structure shows pronounced site-dependence effects at the length scales of the chain and the nearest-neighbor distance. There, we also find the strongest temperature dependence, which drives the glass transition. Both the site-averaged coupling of the monomer and center of mass (CM) and the CM-CM coupling are weak and presumably not responsible for a peak in the coherent relaxation time at the chain's length scale. Chains rather emerge as soft, easily interpenetrating objects. Three-particle correlations are well reproduced by the convolution approximation, with the exception of model-dependent deviations. In the spatially heterogeneous dynamics of our system we identify highly mobile monomers which tend to follow each other in one-dimensional paths forming "strings". These strings have an exponential length distribution and are generally short compared to the chain length. Thus, a relaxation mechanism in which neighboring mobile monomers move along the backbone of the chain seems unlikely. However, the correlation of bonded neighbors is enhanced. When liquids are confined between two surfaces in relative sliding motion, kinetic friction is observed. We study a generic model setup by molecular dynamics simulations for a wide range of sliding speeds, temperatures, loads, and lubricant coverings for simple and molecular fluids. Instabilities in the particle trajectories are identified as the origin of kinetic friction. 
They lead to high particle velocities of fluid atoms which are gradually dissipated, resulting in a friction force. In commensurate systems fluid atoms follow continuous trajectories for sub-monolayer coverings and consequently, friction vanishes at low sliding speeds. For incommensurate systems the velocity probability distribution exhibits approximately exponential tails. We connect this velocity distribution to the kinetic friction force, which reaches a constant value at low sliding speeds. This approach agrees well with the friction obtained directly from simulations and explains Amontons' law on the microscopic level. Molecular bonds in commensurate systems lead to incommensurate behavior, but do not change the qualitative behavior of incommensurate systems. However, crossed chains form stable load-bearing asperities which strongly increase friction.
Abstract:
Somatostatin is a multifunctional molecule, credited with neurotransmitter, neuromodulator and (neuro)hormone properties. In keeping with its ubiquitous distribution in tissues, it influences metabolic and developmental processes, up to and including learning and memory performance. These effects result from the local and temporal interplay of one ligand and five G-protein-coupled receptors (SSTR1-5). To characterize the biological significance of the somatostatin system in the whole organism, a mutational analysis of individual system components was performed. It comprised the inactivation of the genes for the somatostatin prepropeptide and for the receptors SSTR3 and SSTR4 by gene targeting. The resulting knockout mutations demonstrate that neither the receptors 3 and 4 nor somatostatin itself is necessary for the survival of the organism under standard housing conditions. The corresponding mouse lines show no immediately conspicuous limitations of their biology. The somatostatin null mouse became the main subject of a detailed investigation, owing to the ligand's superordinate position in the signalling cascade and the available evidence on its function. After thorough analysis, the following conclusions could be drawn: loss of the somatostatin gene results in elevated plasma concentrations of growth hormone (GH). This is consistent with the role of somatostatin as an inhibitory factor of growth hormone release, an inhibition that is abolished in the mutant. The somatostatin null mouse also revealed that somatostatin acts as an essential link between the growth and stress axes. Permanently elevated corticosterone levels in the mutants imply a negative tonic influence on the secretion of glucocorticoids in vivo. The knockout mouse thus shows that somatostatin normally functions as a decisive inhibitory control element of steroid release.
Behavioural experiments revealed a deficit in motor learning. Somatostatin null mice lag behind their conspecifics in the rotarod ("rotating rod") learning paradigm, without being generally impaired in motor function or coordination. These motor learning processes depend on a functioning cerebellum. Since somatostatin and its receptors are scarcely expressed in the adult cerebellum, but are expressed in the developing one, this result demonstrates a function for neuropeptides transiently expressed during development, a long-standing hypothesis that had not previously been confirmed experimentally. Examination of further physiological parameters and behavioural categories under standard laboratory conditions revealed no visible deviations from wild-type mice. An animal model is thus now available for further somatostatin research: in endocrinological, electrophysiological and behavioural experiments, effects can now be correlated directly and selectively with the somatostatin peptide, with the receptors 3 and 4, or with combinations of the knockout mutations obtained by appropriate crosses.
Abstract:
The term "Brain Imaging" identifies a set of techniques to analyze the structure and/or functional behavior of the brain in normal and/or pathological situations. These techniques are largely used in the study of brain activity. In addition to clinical usage, analysis of brain activity is gaining popularity in other recent fields, e.g. Brain Computer Interfaces (BCI) and the study of cognitive processes. In this context, usage of classical solutions (e.g. fMRI, PET-CT) may be unfeasible, due to their low temporal resolution, high cost and limited portability. For these reasons, alternative low-cost techniques are an object of research, typically based on simple recording hardware and on an intensive data elaboration process. Typical examples are ElectroEncephaloGraphy (EEG) and Electrical Impedance Tomography (EIT), where the electric potential at the patient's scalp is recorded by high-impedance electrodes. In EEG, potentials are directly generated by neuronal activity, while in EIT they are induced by the injection of small currents at the scalp. To retrieve meaningful insights on brain activity from measurements, EIT and EEG rely on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of the electric field distribution therein. The inhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility, which currently severely limits the capabilities of these techniques. Moreover, elaboration of the recorded data requires computationally intensive regularization techniques, which penalizes applications with hard temporal constraints (such as BCI). This work focuses on the parallel implementation of a workflow for EEG and EIT data processing. 
The resulting software is accelerated using multi-core GPUs, in order to provide solutions in reasonable time and address the requirements of real-time BCI systems, without over-simplifying the complexity and accuracy of the head models.
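The regularized inversion underlying such pipelines can be illustrated by Tikhonov regularization on a deliberately ill-conditioned toy system (a sketch of the general technique, not the thesis' GPU solver):

```python
# Tikhonov regularization: minimize ||Ax - b||^2 + lam * ||x||^2 for a
# two-unknown system, via the normal equations (A^T A + lam I) x = A^T b.
# Toy-sized on purpose; real EEG/EIT head models are vastly larger,
# which is what motivates GPU acceleration.
def tikhonov_2x2(A, b, lam):
    ata = [[sum(row[i] * row[j] for row in A) for j in range(2)]
           for i in range(2)]
    ata[0][0] += lam
    ata[1][1] += lam
    atb = [sum(row[i] * bi for row, bi in zip(A, b)) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    # Cramer's rule for the regularized 2x2 system
    x0 = (atb[0] * ata[1][1] - ata[0][1] * atb[1]) / det
    x1 = (ata[0][0] * atb[1] - atb[0] * ata[1][0]) / det
    return [x0, x1]

# nearly collinear columns: the unregularized problem is ill-conditioned
A = [[1.0, 1.0], [1.0, 1.0001], [1.0, 0.9999]]
b = [2.0, 2.0, 2.0]
x = tikhonov_2x2(A, b, lam=0.1)
```

Without the penalty term the near-singular normal matrix would amplify noise; the regularizer trades a small bias for a stable solution.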
Abstract:
The ever-increasing demand for new services from users who want high-quality broadband services while on the move is straining the efficiency of current spectrum allocation paradigms, leading to an overall feeling of spectrum scarcity. In order to circumvent this problem, two possible solutions are being investigated: (i) implementing new technologies capable of accessing temporarily/locally unused bands without interfering with licensed services, such as Cognitive Radios; (ii) releasing some spectrum bands thanks to new services providing higher spectral efficiency, e.g., DVB-T, and allocating them to new wireless systems. These two approaches are promising, but also pose novel coexistence and interference management challenges. In particular, the deployment of devices such as Cognitive Radios, characterized by the inherently unplanned, irregular and random locations of the network nodes, requires advanced mathematical techniques to explicitly model their spatial distribution. In such a context, system performance and optimization are strongly dependent on this spatial configuration. On the other hand, allocating released spectrum bands to other wireless services poses severe coexistence issues with all the pre-existing services on the same or adjacent spectrum bands. In this thesis, these methodologies for better spectrum usage are investigated. In particular, using Stochastic Geometry theory, a novel mathematical framework is introduced for cognitive networks, providing a closed-form expression for the coverage probability and a single-integral form for the average downlink rate and Average Symbol Error Probability. Then, focusing on more regulatory aspects, interference challenges between DVB-T and LTE systems are analysed, proposing a versatile methodology for their proper coexistence.
Moreover, the studies performed inside the CEPT SE43 working group on the amount of spectrum potentially available to Cognitive Radios and an analysis of the Hidden Node problem are provided. Finally, a study on the extension of cognitive technologies to Hybrid Satellite Terrestrial Systems is proposed.
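The closed-form coverage probability mentioned above can be cross-checked numerically with a simple Monte-Carlo sketch of a Poisson network (density, path-loss exponent and SIR threshold are illustrative assumptions):

```python
import math
import random

def poisson_sample(rng, lam):
    # Knuth's method; adequate for the moderate means used here
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        p *= rng.random()
        k += 1
    return k - 1

def coverage_probability(density=0.1, alpha=4.0, theta=1.0,
                         radius=10.0, trials=2000, seed=7):
    # User at the origin of a disc; base stations form a Poisson field.
    # The user connects to the nearest base station and is "covered"
    # when the received SIR exceeds the threshold theta.
    rng = random.Random(seed)
    mean_points = density * math.pi * radius ** 2
    covered = 0
    for _ in range(trials):
        n = poisson_sample(rng, mean_points)
        if n <= 0:
            continue  # no base station at all: not covered
        # uniform points in a disc: radius scaled by sqrt of a uniform
        dists = sorted(radius * math.sqrt(rng.random()) for _ in range(n))
        signal = dists[0] ** -alpha          # nearest base station
        interference = sum(d ** -alpha for d in dists[1:])
        if interference == 0 or signal / interference > theta:
            covered += 1
    return covered / trials

p = coverage_probability()
```

Such simulations are exactly what the closed-form stochastic-geometry expressions replace, turning hours of Monte-Carlo runs into a single formula evaluation.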
Resumo:
In this thesis I concentrate on the angular correlations in top quark decays and their next-to-leading order (NLO) QCD corrections. I also discuss the leading-order (LO) angular correlations in unpolarized and polarized hyperon decays. In the first part of the thesis I calculate the angular correlation between the top quark spin and the momentum of the decay products in the rest-frame decay of a polarized top quark into a charged Higgs boson and a bottom quark in Two-Higgs-Doublet Models: $t(\uparrow)\rightarrow b+H^{+}$. The decay rate in this process is split into an angular-independent part (unpolarized) and an angular-dependent part (polar correlation). I provide closed-form formulae for the ${\mathcal O}(\alpha_{s})$ radiative corrections to the unpolarized and the polar correlation functions for $m_{b}\neq 0$ and $m_{b}=0$. The results for the unpolarized rate agree with the existing results in the literature. The results for the polarized correlations are new. I found that, for certain values of $\tan\beta$, the ${\mathcal O}(\alpha_{s})$ radiative corrections to the unpolarized rate, the polarized rate and the asymmetry parameter can become quite large. In the second part I concentrate on the semileptonic rest-frame decay of a polarized top quark into a bottom quark and a lepton pair: $t(\uparrow) \to X_b + \ell^+ + \nu_\ell$. I analyze the angular correlations between the top quark spin and the momenta of the decay products in two different helicity coordinate systems: system 1a with the $z$-axis along the charged lepton momentum, and system 3a with the $z$-axis along the neutrino momentum. The decay rate then splits into an angular-independent part (unpolarized), a polar-angle-dependent part (polar correlation) and an azimuthal-angle-dependent part (azimuthal correlation). I present closed-form expressions for the ${\mathcal O}(\alpha_{s})$ radiative corrections to the unpolarized part and the polar and azimuthal correlations in systems 1a and 3a for $m_{b}\neq 0$ and $m_{b}=0$.
For the unpolarized part and the polar correlation I agree with existing results. My results for the azimuthal correlations are new. In system 1a I found that the azimuthal correlation vanishes at leading order as a consequence of the $(V-A)$ nature of the Standard Model current. The ${\mathcal O}(\alpha_{s})$ radiative corrections to the azimuthal correlation in system 1a are very small (around 0.24% relative to the unpolarized LO rate). In system 3a the azimuthal correlation does not vanish at LO. The ${\mathcal O}(\alpha_{s})$ radiative corrections decrease the LO azimuthal asymmetry by around 1%. In the last part I turn to the angular distributions in semileptonic hyperon decays. Using the helicity method I derive complete formulae for the leading-order joint angular decay distributions occurring in semileptonic hyperon decays, including lepton-mass and polarization effects. Compared to the traditional covariant calculation, the helicity method allows one to organize the calculation of the angular decay distributions in a very compact and efficient way. This is demonstrated by the specific example of the polarized hyperon decay $\Xi^0(\uparrow) \to \Sigma^+ + l^- + \bar{\nu}_l$ ($l^-=e^-, \mu^-$) followed by the nonleptonic decay $\Sigma^+ \to p + \pi^0$, which is described by a five-fold angular decay distribution.
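Schematically, the three-fold decomposition of the polarized decay rate described above takes the generic form (a sketch of the standard decomposition only; $A$, $B$ and $C$ stand for the unpolarized rate and the polar and azimuthal correlation functions, and $P$ denotes the magnitude of the top quark polarization):

```latex
\frac{d\Gamma}{d\cos\theta_P \, d\phi}
  = \frac{1}{4\pi}\Bigl( A \;+\; B \, P \cos\theta_P \;+\; C \, P \sin\theta_P \cos\phi \Bigr)
```

Setting $C = 0$ recovers the two-fold (unpolarized plus polar) split used in the charged-Higgs channel, and the vanishing of $C$ at LO in system 1a corresponds to the $(V-A)$ argument quoted above.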
Effect of drug physicochemical properties on the release from liposomal systems in vitro and in vivo
Resumo:
Liposomes were discovered about 40 years ago by A. Bangham, and since then they have become very versatile tools in biology, biochemistry and medicine. Liposomes are the smallest artificial vesicles of spherical shape that can be produced from natural non-toxic phospholipids and cholesterol. Liposome vesicles can be used as drug carriers and loaded with a great variety of molecules, such as small drug molecules, proteins, nucleotides and even plasmids. Owing to the variability of liposomal compositions, they can be used for a large number of applications. In this thesis the β-adrenoceptor antagonists propranolol, metoprolol, atenolol and pindolol, glucose, 18F-fluorodeoxyglucose (FDG) and Er-DTPA were encapsulated in liposomes, which were then characterized and subjected to in vitro release studies. Multilamellar vesicles (MLV), large unilamellar vesicles (LUV) and small unilamellar vesicles (SUV) were prepared using one of the following lipids: 1,2-Dimyristoyl-sn-Glycero-3-Phosphocholine (DMPC), 1,2-Distearoyl-sn-Glycero-3-Phosphocholine (DSPC), Phospholipone 90H (Ph90H) or a mixture of DSPC and DMPC (1:1). The freeze-thawing method was used for the preparation of the liposomes because it offers three advantages: (1) it avoids the use of chloroform, which is used in other methods and causes toxicity; (2) it is a simple method; and (3) it gives high entrapping efficiency. The entrapping efficiency (EE) differed depending on the type and phase-transition temperature (Tc) of the lipid used. The average particle size and particle size distribution of the prepared liposomes were determined using both dynamic light scattering (DLS) and a laser diffraction analyzer (LDA). The average particle size of the prepared liposomes differed according to both liposome type and lipid type. Dispersion and dialysis techniques were used to study the in vitro release of the β-adrenoceptor antagonists. The in vitro release rate of the β-adrenoceptor antagonists increased from MLV to LUV to SUV.
Regarding the lipid type, the β-adrenoceptor antagonists exhibited different in vitro release patterns from one lipid to another. Two different concentrations (50 and 100 mg/ml) of Ph90H were used to study the effect of lipid concentration on the in vitro release of the β-adrenoceptor antagonists. Liposomes made from 50 mg/ml Ph90H exhibited higher release rates than liposomes made from 100 mg/ml Ph90H. Glucose was also encapsulated in MLV, LUV and SUV using 1,2-Dimyristoyl-sn-Glycero-3-Phosphocholine (DMPC), 1,2-Distearoyl-sn-Glycero-3-Phosphocholine (DSPC), Phospholipone 90H (Ph90H), soybean lipid (Syb) or a mixture of DSPC and DMPC (1:1). The average particle size and size distribution were determined using laser diffraction analysis. Both EE and average particle size were found to differ depending on both the lipid and the liposome type. The in vitro release of glucose from the different types of liposomes was studied using a dispersion method and was found to depend on the lipid type. 18F-FDG was encapsulated in MLV using 1,2-Dimyristoyl-sn-Glycero-3-Phosphocholine (DMPC), 1,2-Distearoyl-sn-Glycero-3-Phosphocholine (DSPC), Phospholipone 90H (Ph90H), soybean lipid (Syb) or a mixture of DSPC and DMPC (1:1). FDG-containing LUV and SUV were prepared using Ph90H lipid. The in vitro release of FDG from the different types of lipids was studied using a dispersion method, with results similar to those obtained for glucose. In vivo imaging of FDG, both unencapsulated and in FDG-containing MLV, was performed in the brain and the whole body of rats using a PET scanner. The release of FDG from FDG-containing MLV was found to be sustained. The in vitro-in vivo correlation was studied using the in vitro release data of FDG from liposomes and the in vivo absorption data of FDG from injected liposomes obtained with microPET.
Erbium, a lanthanide metal, was used as a chelate with DTPA for encapsulation in SUV liposomes for the indirect radiation therapy of cancer. The liposomes were prepared using three different concentrations of soybean lipid (30, 50 and 70 mg/ml). The stability of the Er-DTPA SUV liposomes was assessed by storing the prepared liposomes at three different temperatures (4, 25 and 37 °C). The release of the Er-DTPA complex was found to be temperature dependent: the higher the temperature, the higher the release. There was an inverse relationship between the release of the Er-DTPA complex and the lipid concentration.
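The qualitative trend reported above (release rate increasing from MLV to LUV to SUV) can be sketched with a first-order release model, a kinetic form commonly applied to in vitro liposomal release profiles. The rate constants below are hypothetical illustrations of the trend, not values fitted to the thesis data.

```python
import math

def fraction_released(k, t):
    """Cumulative fraction released at time t (hours) under first-order
    kinetics: M(t)/M_inf = 1 - exp(-k * t), with rate constant k (1/h)."""
    return 1.0 - math.exp(-k * t)

# Hypothetical rate constants reflecting the reported ordering
# (faster release for smaller, fewer-lamellae vesicles).
RATES = {"MLV": 0.05, "LUV": 0.12, "SUV": 0.25}

def release_profile(liposome_type, times):
    """Release profile (fractions in [0, 1]) for one vesicle type."""
    k = RATES[liposome_type]
    return [fraction_released(k, t) for t in times]
```

Plotting the three profiles over, say, 0-48 h reproduces the ordering SUV > LUV > MLV at every time point, which is the shape of the comparison made in the release studies.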
Resumo:
During the last decade peach and nectarine fruit have lost considerable market share, owing to increased consumer dissatisfaction with quality at retail markets. This is mainly due to the harvesting of too-immature fruit and to high ripening heterogeneity. The main problem is that the traditionally used maturity indices cannot objectively detect the fruit maturity stage or the variability present in the field, leading to difficult post-harvest management of the product and to high fruit losses. To assess fruit ripening more precisely, other techniques and devices can be used. Recently, a new non-destructive maturity index based on vis-NIR technology, the Index of Absorbance Difference (IAD), which correlates with fruit degreening and ethylene production, was introduced, and the IAD was used to study peach and nectarine fruit ripening from "field to fork". In order to choose the best techniques to improve fruit quality, a detailed description of the tree structure, of fruit distribution and of ripening evolution on the tree was undertaken. More specifically, an architectural model (PlantToon®) was used to map the tree structure, and the IAD was applied to characterize the maturity stage of each fruit. Their combined use provided an objective and precise evaluation of the fruit-ripening variability related to different training systems, crop load, fruit exposure and internal temperature. Based on simple field assessments of fruit maturity (as IAD) and growth, a model for an early prediction of harvest date and yield was developed and validated. The relationship between the non-destructive maturity index IAD and fruit shelf-life was also confirmed. Finally, the results were validated by consumer tests: fruit sorted into different maturity classes obtained different consumer acceptance. The improved knowledge led to an innovative management of peach and nectarine fruit, from "field to market".
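The IAD itself is commonly computed as the difference between the absorbance near the chlorophyll-a peak (around 670 nm) and a reference wavelength (around 720 nm); it decreases as chlorophyll degrades during ripening. The sketch below assumes that standard definition, and the class thresholds are hypothetical placeholders (real limits are cultivar-specific):

```python
def index_of_absorbance_difference(a670, a720):
    """IAD: absorbance near the chlorophyll-a peak (670 nm) minus the
    absorbance at a reference wavelength (720 nm). Lower values
    indicate riper fruit (less chlorophyll)."""
    return a670 - a720

def maturity_class(iad, thresholds=(0.6, 1.4)):
    """Sort a fruit into an illustrative maturity class by its IAD.
    The threshold values here are hypothetical, not cultivar-calibrated."""
    ripe_limit, unripe_limit = thresholds
    if iad < ripe_limit:
        return "ready to eat"
    elif iad < unripe_limit:
        return "harvest-mature"
    return "immature"
```

Sorting a batch of fruit by `maturity_class` mirrors the consumer-test design described above, where fruit in different IAD classes received different acceptance scores.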
Resumo:
This work comprises three different types of unconventional correlated systems. Chapters 3-5 concern the open-shell compounds Rb4O6 and Cs4O6. These mixed-valent compounds contain oxygen in two different modifications: the closed-shell peroxide anion is nonmagnetic, whereas the hyperoxide anion contains an unpaired electron in an antibonding π*-orbital. This electron renders magnetic ordering possible. In contrast to theoretical predictions, which suggested half-metallic ferromagnetism for Rb4O6, dominating antiferromagnetic interactions were found in the experiment. Besides a symmetry reduction due to the mixed valency, strong electronic correlations of this highly molecular system determine its properties; it is a magnetically frustrated insulator. The corresponding Cs4O6 was found to show similar properties. Chapters 6-9 concern intermetallic Heusler superconductors. All of these superconductors were rationally designed using the van Hove scenario as a working recipe. A saddle point in the energy dispersion of a solid leads to a van Hove singularity in the density of states. In the Ni-based and Pd-based Heusler superconductors presented in this work, this sort of valence instability occurs at the high-symmetry L point and coincides or nearly coincides with the Fermi level. The compounds escape the high density of states at the Fermi energy through a transition into the correlated superconducting state. Chapter 10 concerns the tetragonally distorted ferrimagnetic DO22 phase of Mn3Ga. This hard-magnetic modification is technologically useful for spin-torque-transfer applications. The phase exhibits two different crystallographic sites occupied by Mn atoms and can thus be written as Mn2MnGa. The competition between the mainly itinerant moments of the Mn atoms at the Wyckoff position 4d and the localized moments of the Mn atoms at the Wyckoff position 2b leads to magnetic correlations.
The antiferromagnetic orientation of these moments results in a net magnetic moment of approximately 1 µB per formula unit in a partially compensated ferrimagnetic configuration.
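Given the Mn2MnGa site occupation (two Mn on 4d, one on 2b) and the antiparallel alignment described above, the partial compensation can be written schematically as (symbols only; the individual site moments are not quoted in this summary):

```latex
\mu_{\mathrm{net}} \;=\; \bigl|\, 2\,\mu_{\mathrm{Mn}(4d)} \;-\; \mu_{\mathrm{Mn}(2b)} \,\bigr|
\;\approx\; 1\,\mu_{B} \ \text{per formula unit}
```

The factor of 2 counts the two 4d Mn atoms per formula unit whose moments oppose the single 2b moment, which is why the ferrimagnet is only partially compensated rather than fully antiferromagnetic.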