928 results for MACROSCOPIC QUANTUM PHENOMENA IN MAGNETIC SYSTEMS
Abstract:
In this paper we study several natural and man-made complex phenomena from the perspective of dynamical systems. For each class of phenomena, the system outputs are time-series records obtained under identical conditions. The time series are viewed as manifestations of the system behavior and are processed to analyze the system dynamics. First, we use the Fourier transform to process the data and approximate the amplitude spectra by means of power law (PL) functions, interpreting the PL parameters as a phenomenological signature of the system dynamics. Second, we adopt the techniques of non-hierarchical clustering and multidimensional scaling to visualize hidden relationships between the complex phenomena. Third, we propose a vector-field-based analogy to interpret the patterns unveiled by the PL parameters.
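As an illustration of the first processing step, the following minimal Python sketch (using synthetic data; the paper's actual datasets, windowing and fitting choices are not specified here) computes the amplitude spectrum of a time series with the FFT and estimates the power law parameters by linear regression in log-log coordinates:

```python
import numpy as np

def power_law_fit(x, fs=1.0):
    """Fit |F(f)| ~ c * f**(-alpha) to the amplitude spectrum of x."""
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    amp = np.abs(np.fft.rfft(x))
    # Discard the zero-frequency term before taking logarithms.
    f, a = freqs[1:], amp[1:]
    slope, intercept = np.polyfit(np.log(f), np.log(a), 1)
    return np.exp(intercept), -slope  # (c, alpha)

# Synthetic example: Brownian noise, whose amplitude spectrum falls off roughly as 1/f.
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(4096))
c, alpha = power_law_fit(x)
print(f"c = {c:.3g}, alpha = {alpha:.3g}")
```

The pair (c, alpha) plays the role of the PL parameters that the abstract treats as a phenomenological signature of the underlying dynamics.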
Abstract:
Based on the report for the unit “Project III” of the PhD programme on Technology Assessment in 2011. The unit was supervised by Prof. António B. Moniz (FCT-UNL).
Abstract:
This study is specifically concerned with the effect of Enterprise Resource Planning (ERP) on Business Process Redesign (BPR). The researcher's experience and an investigation of previous research indicate that BPR and ERP are deeply related to each other and that a study examining this relation further is necessary. To elaborate the hypothesis, a case study is investigated, namely the Turkish electricity distribution market during its privatization phase. Eight companies that took part in the privatization process and executed BPR serve as cases in this study. During the research, the cases were evaluated against critical success factors for both BPR and ERP. It was seen that combining ERP solution features with business processes leads companies to succeed in ERP and BPR implementation. When the companies' success and efficiency before and after the ERP implementation were compared, a considerable change was observed in organizational structure. It was also found that team composition is important to the success of ERP projects. Additionally, when ERP plays a driver or enabler role, the companies can be considered successful; on the contrary, when ERP plays a neutral role with respect to business processes, the project fails. In conclusion, it can be said that the companies that implemented ERP successfully also accomplished the goals of BPR.
Abstract:
Information technologies have changed the way health organizations work, contributing to their effectiveness, efficiency and sustainability. Hospital Information Systems (HIS) are emerging in health institutions everywhere, helping health professionals and patients. However, HIS are not always implemented and used in the best way, leading to low levels of benefit and of acceptance by the users of these systems. To mitigate this problem, it is essential to take measures that ensure HIS and their interfaces are designed in a simple and interactive way. With this in mind, a study was carried out to measure user satisfaction and gather user opinions. The Technology Acceptance Model (TAM) was applied to a HIS implemented in several hospital centers (AIDA), using the Pathologic Anatomy Service. The study identified weak and strong features of AIDA and pointed to some solutions to improve the medical record.
Abstract:
Water-soluble, non-biodegradable organic substances such as certain herbicides, industrial dyes and metabolites of widely used drugs are a major source of pollution in groundwater from agricultural areas and in industrial and domestic effluents. Photocatalytic reactions driven by UV-visible irradiation and organic or inorganic sensitizers are among the most economical and convenient methods for decomposing pollutants into harmless and/or biodegradable byproducts. In many applications, a high degree of specificity, effectiveness and speed of degradation of a given pollutant present in a complex mixture of organic substances in solution is highly desirable. In particular, nano/micro-particle systems that form stable aqueous suspensions are attractive because they allow easy application and effective decontamination of large volumes of fluid. The general objective of this project is to develop nano/micro-particle systems composed of molecularly imprinted polymers (MIPs) and photosensitizers (PS). A MIP is a polymer specially synthesized to recognize a specific analyte (the template molecule). The specific binding activity of MIPs, combined with the photocatalytic capacity of the sensitizers, can be used to achieve the photodecomposition of specific "template" molecules (in this case a given pollutant) in solutions containing complex mixtures of organic substances. Mini-emulsion polymerization techniques will be used to synthesize the nano/micro MIP-PS systems targeted at the degradation of the compounds of interest. Spectroscopic (steady-state and time-resolved) and chromatographic (HPLC and GC) techniques will be used to characterize the efficiency, mechanisms and specificity of photodegradation in these systems. In addition, single-molecule/single-particle fluorescence techniques will be used to directly measure distributions of binding affinities and photodegradation efficiencies in individual particles. These measurements will be key to analyzing the factors that affect photodegradation efficiency at the nano/micro scale, such as the amount and location of the photosensitizers within the polymer matrices and the binding efficiency of the template and of the degradation products to the MIP. The proposed studies aim at a better understanding of photo-initiated processes in nano/micro-particulate environments, in order to apply that knowledge to the design of optimized systems for the selective photo-destruction of socially relevant aqueous pollutants such as herbicides, industrial residues and metabolites of widely used drugs. The MIP-PS nano/micro-particle systems to be developed in this project are ideal candidates for the specific treatment of industrial and domestic effluents in which the selective degradation of organic compounds is sought, and the knowledge acquired will be essential for building a versatile platform of specific photocatalytic systems for the degradation of various organic pollutants of social interest.
Regarding the training of human resources, the proposed project will contribute directly to the training of three postgraduate and two undergraduate students. In terms of institutional capacities, it will contribute to the upgrading of the Laboratory for Advanced Optical Microscopy (LMOA) at the Chemistry Department of the UNRC and to the assembly of a fluorescence microscope that will enable advanced single-molecule fluorescence spectroscopy techniques.
Abstract:
Report for the scientific sojourn at the Philipps-Universität Marburg, Germany, from September to December 2007. In the first project, we employed Energy-Decomposition Analysis (EDA) to investigate aromaticity in Fischer carbenes, as it relates to all the reaction mechanisms studied in my PhD thesis. This powerful tool, compared with other well-known aromaticity indices in the literature such as NICS, is useful not only for quantitative results but also for measuring the degree of conjugation or hyperconjugation in molecules. Our results showed, for the annelated benzenoid systems studied here, that the electron density is more concentrated in the outer rings than in the central one. Strain-induced bond localization plays a major role as a driving force in keeping the more substituted ring the less aromatic one. The discussion presented in this work was contrasted at different levels of theory to calibrate the method and ensure the consistency of our results. We think these conclusions can also be extended to arene chemistry to explain the aromaticity and regioselectivity of reactions found in those systems. In the second project, we employed the Turbomole program package and the best-performing state-of-the-art density functionals to explore reaction mechanisms in noble gas chemistry. In particular, we were interested in compounds of the form H–Ng–Ng–F (where Ng (noble gas) = Ar, Kr and Xe) and investigated the relative stability of these species. Our quantum chemical calculations predict that the dixenon compound HXeXeF has an activation barrier for decomposition of 11 kcal/mol, which should be large enough to identify the molecule in a low-temperature matrix. The other noble gases present lower activation barriers and are therefore more labile and harder to observe experimentally.
Abstract:
This paper is concerned with the modeling and analysis of quantum dissipation phenomena in the Schrödinger picture. More precisely, we investigate in detail a dissipative, nonlinear Schrödinger equation accounting for quantum Fokker–Planck effects, and how it is drastically reduced to a simpler logarithmic equation via a nonlinear gauge transformation in such a way that the physics underlying both problems remains unaltered. From a mathematical viewpoint, this allows for a more tractable analysis of the local well-posedness of the initial–boundary value problem. The simplification requires performing the polar (modulus–argument) decomposition of the wavefunction, which is rigorously attained (for the first time, to the best of our knowledge) under quite reasonable assumptions.
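For context, the polar (modulus–argument) decomposition referred to above has the standard Madelung-type form (a sketch only; the precise regularity assumptions under which it is rigorously justified are those stated in the paper):

```latex
\psi(x,t) \;=\; \sqrt{n(x,t)}\, e^{\,i S(x,t)/\hbar},
\qquad n(x,t) = |\psi(x,t)|^{2},
```

where n is the position density (the squared modulus) and S is a real phase function (the argument).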
Abstract:
Fault location has been studied in depth for transmission lines because of its importance in power systems. Nowadays the problem of fault location on distribution systems is receiving special attention, mainly because of power quality regulations. In this context, this paper presents an application developed in Matlab that automatically calculates the location of a fault in a distribution power system, starting from the voltages and currents measured at the line terminal and the model of the distribution power system. The application is based on an N-ary tree structure, which suits this problem because of the highly branched and non-homogeneous nature of distribution systems, and has been developed for single-phase, two-phase, two-phase-to-ground and three-phase faults. The implemented application is tested using fault data from a real electrical distribution power system.
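The paper's own data structures are not reproduced here, but a minimal Python sketch of the kind of N-ary tree it describes (a hypothetical Section class with illustrative attributes, not the authors' implementation) could look like this:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Section:
    """One line section of a radial distribution feeder (hypothetical fields)."""
    name: str
    length_km: float
    impedance: complex                       # series impedance per km (ohm/km)
    children: List["Section"] = field(default_factory=list)

    def add(self, child: "Section") -> "Section":
        self.children.append(child)
        return child

def candidate_paths(root: Section) -> List[List[Section]]:
    """Enumerate every path from the substation node to a feeder end.

    Each root-to-leaf path is one branch on which a distance-to-fault
    estimate could be evaluated, since terminal measurements alone cannot
    distinguish laterals at the same electrical distance.
    """
    if not root.children:
        return [[root]]
    paths = []
    for child in root.children:
        for tail in candidate_paths(child):
            paths.append([root] + tail)
    return paths

# Tiny example feeder: substation -> main trunk with two laterals.
substation = Section("S0", 0.0, 0j)
trunk = substation.add(Section("T1", 2.0, 0.3 + 0.4j))
trunk.add(Section("L1", 1.0, 0.5 + 0.3j))
trunk.add(Section("L2", 1.5, 0.5 + 0.3j))
for path in candidate_paths(substation):
    print(" -> ".join(s.name for s in path))
```

A tree (rather than a single-line model) naturally captures the highly branched, non-homogeneous topology the abstract mentions, which is the stated reason for the choice.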
Abstract:
This paper shows the impact of the atomic-capabilities concept for including control-oriented knowledge of linear control systems in the decision-making structure of physical agents. These agents operate in a real environment, managing physical objects (e.g. their physical bodies) in coordinated tasks. The approach is presented using introspective reasoning and control theory, based on the specific tasks of passing a ball and executing the offside manoeuvre between physical agents in the robotic soccer testbed. Experimental results and conclusions are presented, emphasising the advantages of our approach in improving multi-agent performance in cooperative systems.
Abstract:
One of the main goals in radiobiology research is to enhance radiotherapy effectiveness without provoking any increase in toxicity. In this context, it has been proposed that electromagnetic fields (EMFs), known to be modulators of proliferation rate, enhancers of apoptosis and inducers of genotoxicity, might control tumor recruitment and thus provide therapeutic benefits. Scientific evidence shows that the effects of ionizing radiation on cellular compartments and functions are strengthened by EMFs. Although little is known about the potential role of EMFs in radiotherapy (RT), the radiosensitizing effect of EMFs described in the literature could support their use to improve radiation effectiveness. Thus, we hypothesized that EMF exposure might enhance the effect of ionizing radiation on tumor cells, improving the effects of RT. The aim of this paper is to review reports of the effects of EMFs in biological systems and their potential therapeutic benefits in radiotherapy.
Abstract:
A fundamental question in developmental biology is how tissues are patterned to give rise to differentiated body structures with distinct morphologies. The Drosophila wing disc offers an accessible model for understanding epithelial spatial patterning, and it has been studied extensively using genetic and molecular approaches. Bristle patterns on the thorax, which arise from the medial part of the wing disc, are a classical model of pattern formation, dependent on a pre-pattern of trans-activators and -repressors. Despite decades of molecular studies, we still know only a subset of the factors that determine the pre-pattern. We are applying a novel, interdisciplinary approach to predict regulatory interactions in this system. It is based on describing expression patterns by simple logical relations (addition, subtraction, intersection and union) between simple shapes (graphical primitives). Similarities and relations between primitives have been shown to be predictive of regulatory relationships between the corresponding regulatory factors in other systems, such as the Drosophila egg. Furthermore, they provide the basis for dynamical models of the bristle-patterning network, which enable us to make even more detailed predictions about gene regulation and expression dynamics. We have obtained a data set of wing disc expression patterns which we are now processing to obtain an average expression pattern for each gene. Through triangulation of the images we can transform the expression patterns into vectors which can easily be analysed with standard clustering methods. These analyses will allow us to identify primitives and regulatory interactions. We expect to identify new regulatory interactions and to understand the basic dynamics of the regulatory network responsible for thorax patterning. These results will give us a better understanding of the rules governing gene regulatory networks in general, and provide the basis for future studies of the evolution of the thorax-patterning network in particular.
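As a sketch of the clustering step (with made-up expression vectors and an arbitrary choice of algorithm; the abstract only says "standard clustering methods"), in Python:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical data: one row per gene, one column per triangulated position
# of the wing disc, values = normalized expression level at that position.
rng = np.random.default_rng(1)
expression = rng.random((20, 300))

# Agglomerative clustering of genes by the similarity of their spatial patterns.
Z = linkage(expression, method="average", metric="correlation")
labels = fcluster(Z, t=4, criterion="maxclust")
print(labels)  # genes sharing a label are candidates for shared regulation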
Abstract:
PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end-point quantity of choice that relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses, since it depends on assessing the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away the statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular-level dosimetry for macroscopic tumors, and end-point quantities such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was allowed to decay at random locations for each model size and for seven different ratios of the number of decays to the number of cells, N_r: 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and each cell was attributed an absorbed dose equal to the bin average absorbed dose plus a randomly determined adjustment drawn from a Gaussian probability distribution whose width equals the statistical uncertainty consistent with the ratio of decays to cells, i.e., N_r^(-1/2). From dose-volume histograms, the surviving fraction of cells, the equivalent uniform dose (EUD), and the TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15 μm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same scenarios. The dosimetric quantities were calculated and compared to the adjusted simple-sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values agreed to within 2% between the adjusted simple-sphere and full cellular models. Additionally, models were generated for a nonuniform distribution of activity, and the results compared between the adjusted spherical and cellular models showed similar agreement. The TCP values for the macroscopic tumor models were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple-sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
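A minimal Python sketch of the adjusted-sphere idea described in METHODS is given below. The per-cell dose assignment (bin average plus a Gaussian adjustment of relative width N_r^(-1/2)) follows the abstract; the linear-quadratic survival model and Poisson TCP used to turn doses into a single TCP value are common choices assumed here for illustration, not taken from the paper:

```python
import numpy as np

def cell_doses(bin_mean_dose, cells_per_bin, n_r, rng):
    """Give each cell its radial bin's average dose plus a Gaussian adjustment
    whose relative width is n_r**-0.5, the statistical uncertainty for n_r
    decays per cell (as described in the abstract)."""
    doses = []
    for d_bin, n_cells in zip(bin_mean_dose, cells_per_bin):
        sigma = d_bin / np.sqrt(n_r)
        doses.append(rng.normal(d_bin, sigma, size=n_cells))
    return np.clip(np.concatenate(doses), 0.0, None)

def tcp(doses, alpha=0.3, beta=0.03):
    """Poisson TCP with a linear-quadratic survival model (assumed here for
    illustration; the abstract does not state the survival model)."""
    surviving = np.exp(-alpha * doses - beta * doses**2)
    return float(np.exp(-surviving.sum()))

rng = np.random.default_rng(42)
bin_mean_dose = np.linspace(40.0, 25.0, 10)   # Gy, hypothetical radial dose profile
cells_per_bin = np.arange(100, 1100, 100)     # more cells in the outer shells
print(tcp(cell_doses(bin_mean_dose, cells_per_bin, n_r=50, rng=rng)))
```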
Abstract:
"Sitting between your past and your future doesn't mean you are in the present." – Dakota Skye. Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending what would be expected from the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by these systems. Moreover, the topology is also a key factor in explaining their extraordinary flexibility and resilience to perturbations in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and fault tolerance of systems in two different contexts. In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, relying on simple rules and on information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one. The outcome is remarkable, as the resulting topologies share properties of both regular and random networks, and display similarities with the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones. In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to those of gene regulatory networks. We have also studied actual biological organisms, and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place in Boolean networks and proposed a more biologically plausible cascading update scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of the previous GRN model, yet with superior resistance to perturbations.
We believe they are one step closer to the biological reality.
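For readers unfamiliar with the baseline model, the following Python sketch implements a classic synchronous Kauffman random Boolean network (the original model this work starts from; the improved topologies, cascading update scheme and refined update functions described above are not reproduced here):

```python
import numpy as np

def random_boolean_network(n, k, rng):
    """Build a classic NK random Boolean network: each node reads k randomly
    chosen inputs and applies a randomly drawn Boolean function of them."""
    inputs = np.array([rng.choice(n, size=k, replace=False) for _ in range(n)])
    tables = rng.integers(0, 2, size=(n, 2 ** k))  # one truth table per node
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: every node switches at the same time."""
    idx = np.zeros(len(state), dtype=int)
    for bit in range(inputs.shape[1]):
        idx = (idx << 1) | state[inputs[:, bit]]
    return tables[np.arange(len(state)), idx]

rng = np.random.default_rng(7)
inputs, tables = random_boolean_network(n=12, k=2, rng=rng)
state = rng.integers(0, 2, size=12)
for _ in range(20):            # iterate; the trajectory settles onto an attractor
    state = step(state, inputs, tables)
    print("".join(map(str, state)))
```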