915 results for Resolution of problems


Relevance:

90.00%

Publisher:

Abstract:

The solution behavior of linear polymer chains is well understood, having been the subject of intense study throughout the previous century. As plastics have become ubiquitous in everyday life, polymer science has grown into a major field of study. The conformation of a polymer in solution depends on its molecular architecture and its interactions with the surroundings. Developments in synthetic techniques have led to the creation of precision-tailored polymeric materials with varied topologies and functionalities. To design materials with the desired properties, it is imperative to understand the relationships between a polymer's architecture and its conformation and behavior. To meet that need, this thesis investigates the conformation and self-assembly of three architecturally complex macromolecular systems with rich and varied behaviors driven by the resolution of intramolecular conflicts. First, we describe the development of a robust and facile synthetic approach to reproducible bottlebrush polymers (Chapter 2). The method was used to produce homologous series of bottlebrush polymers with polynorbornene backbones, which revealed the effect of side-chain and backbone length on the overall conformation in both good and theta solvent conditions (Chapter 3). The side-chain conformation was obtained from a series of SANS experiments and found to be indistinguishable from the behavior of free linear polymer chains. Using deuterium-labeled bottlebrushes, we directly observed, for the first time, the backbone conformation of a bottlebrush polymer, which showed self-avoiding-walk behavior. Second, a series of SANS experiments was conducted on a homologous series of side-group liquid crystalline polymers (SGLCPs) in a perdeuterated small-molecule liquid crystal (5CB). Monodomain, aligned, dilute samples of SGLCP-b-PS block copolymers were seen to self-assemble into complex micellar structures with mutually orthogonally oriented anisotropies at different length scales (Chapter 4). Finally, we present the results of the first scattering experiments on a set of fuel-soluble, associating telechelic polymers. We observed the formation of supramolecular aggregates in dilute (≤ 0.5 wt%) solutions of telechelic polymers and determined that the choice of solvent has a significant effect on the strength of association and the size of the supramolecules (Chapter 5). A method was developed for the direct estimation of the supramolecular aggregation number from SANS data. The insight into structure-property relationships obtained from this work will enable more targeted development of these molecular architectures for their respective applications.

Relevance:

90.00%

Publisher:

Abstract:

This thesis has two major parts. The first part describes a high-energy cosmic-ray detector, the High Energy Isotope Spectrometer Telescope (HEIST). HEIST is a large-area (0.25 m²·sr) balloon-borne isotope spectrometer designed to make high-resolution measurements of isotopes in the element range from neon to nickel (10 ≤ Z ≤ 28) at energies of about 2 GeV/nucleon. The instrument consists of a stack of 12 NaI(Tl) scintillators, two Cerenkov counters, and two plastic scintillators. Each of the 2-cm-thick NaI disks is viewed by six 1.5-inch photomultipliers whose combined outputs measure the energy deposition in that layer. In addition, the six outputs from each disk are compared to determine the position at which incident nuclei traverse each layer to an accuracy of ~2 mm. The Cerenkov counters, which measure particle velocity, are each viewed by twelve 5-inch photomultipliers using light-integration boxes.

HEIST-2 determines the mass of individual nuclei by measuring both the change in the Lorentz factor (Δγ) that results from traversing the NaI stack and the energy loss (ΔE) in the stack. Since the total energy of an isotope is given by E = γM (in units where c = 1), the mass M can be determined as M = ΔE/Δγ. The instrument is designed to achieve a typical mass resolution of 0.2 amu.
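The mass relation above can be sketched numerically. The numbers below are illustrative, not instrument data.

```python
# Sketch of the mass determination described above: with total energy
# E = gamma * M (c = 1), a measured energy loss dE and a measured change
# dgamma in the Lorentz factor give M = dE / dgamma.

def mass_from_stack(delta_E_GeV, delta_gamma):
    """Isotope mass (in GeV) from energy loss and Lorentz-factor change."""
    return delta_E_GeV / delta_gamma

# Illustrative example: a nucleus losing 5.0 GeV while gamma drops by 0.1.
m_GeV = mass_from_stack(5.0, 0.1)   # 50.0 GeV
m_amu = m_GeV / 0.9315              # 1 amu ~ 0.9315 GeV
```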

The second part of this thesis presents an experimental measurement of the isotopic composition of the fragments from the breakup of high-energy 40Ar and 56Fe nuclei. Cosmic-ray composition studies rely heavily on semi-empirical estimates of the cross sections for the nuclear fragmentation reactions that alter the composition during propagation through the interstellar medium. Experimentally measured yields of isotopes from the fragmentation of 40Ar and 56Fe are compared with calculated yields based on semi-empirical cross-section formulae. There are two sets of measurements. The first, made at the Lawrence Berkeley Laboratory Bevalac using a beam of 287 MeV/nucleon 40Ar incident on a CH2 target, achieves excellent mass resolution (≤ 0.2 amu) for isotopes of Mg through K using a Si(Li) detector telescope. The second, also made at the Lawrence Berkeley Laboratory Bevalac, using a beam of 583 MeV/nucleon 56Fe incident on a CH2 target, resolved Cr, Mn, and Fe fragments with a typical mass resolution of ~0.25 amu through the use of the Heavy Isotope Spectrometer Telescope (HIST), which was later carried into space on ISEE-3 in 1978. The general agreement between calculation and experiment is good, but some significant differences are reported here.

Relevance:

90.00%

Publisher:

Abstract:

The spatial longitudinal coherence length (SLCL), which is determined by the size of and the distance from the source, is introduced to investigate the longitudinal resolution of lensless ghost imaging. Its influence is analyzed quantitatively by simulation. The discrepancy in position sensitivity between the results of Scarcelli et al. [Appl. Phys. Lett. 88, 061106 (2006)] and those of Basano and Ottonello [Appl. Phys. Lett. 88, 091109 (2006)] is clarified. (C) 2008 Optical Society of America.

Relevance:

90.00%

Publisher:

Abstract:

Wide field-of-view (FOV) microscopy is of high importance to biological research and clinical diagnosis, where high-throughput screening of samples is needed. This thesis presents the development of several novel wide-FOV imaging technologies and demonstrates their capabilities in longitudinal imaging of living organisms, at scales ranging from viral plaques to live cells and tissues.

The ePetri Dish is a wide-FOV on-chip bright-field microscope. Here we applied the ePetri platform to plaque analysis of murine norovirus 1 (MNV-1). The ePetri offers the ability to dynamically track plaques at the level of individual cell-death events over a wide FOV of 6 mm × 4 mm at 30-min intervals. A density-based clustering algorithm is used to analyze the spatiotemporal distribution of cell-death events and identify plaques at their earliest stages. We also demonstrate the capabilities of the ePetri in viral titer counting and in dynamically monitoring plaque formation, growth, and the influence of antiviral drugs.
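The density-based clustering step can be illustrated with a minimal DBSCAN-style routine: cell-death events that are close in space are grouped into clusters (candidate plaques), while isolated events are treated as noise. The coordinates and parameters below are illustrative assumptions, not data from the MNV-1 experiments.

```python
# Minimal DBSCAN-style clustering of 2D event positions.
from math import hypot

def dbscan(points, eps, min_pts):
    """Label each point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)

    def neighbors(i):
        return [j for j, q in enumerate(points)
                if hypot(points[i][0] - q[0], points[i][1] - q[1]) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # provisionally noise
            continue
        labels[i] = cluster          # i is a core point: start a cluster
        seeds = [j for j in nbrs if j != i]
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:   # j is also core: keep expanding
                seeds.extend(j_nbrs)
        cluster += 1
    return labels

# Two tight groups of events (two candidate plaques) plus one stray event:
events = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (50, 50)]
labels = dbscan(events, eps=2.0, min_pts=3)
```

Production analyses would typically use an optimized implementation (e.g. scikit-learn's `DBSCAN`); the sketch only shows the grouping logic.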

We developed another wide-FOV imaging technique, the Talbot microscope, for the fluorescence imaging of live cells. The Talbot microscope takes advantage of the Talbot effect and generates a focal-spot array to scan fluorescent samples directly on-chip. It has a resolution of 1.2 μm and a FOV of ~13 mm². We further upgraded the Talbot microscope for long-term time-lapse fluorescence imaging of live cell cultures and analyzed the cells' dynamic response to an anticancer drug.

We present two wide-FOV endoscopes for tissue imaging, named the AnCam and the PanCam. The AnCam is based on contact image sensor (CIS) technology and can scan the whole anal canal within 10 seconds with a resolution of 89 μm, a maximum FOV of 100 mm × 120 mm, and a depth of field (DOF) of 0.65 mm. We also demonstrate the performance of the AnCam in whole anal canal imaging in both animal models and patients. The PanCam, in turn, is based on a smartphone platform integrated with a panoramic annular lens (PAL) and can capture a FOV of 18 mm × 120 mm in a single shot with a resolution of 100–140 μm. In this work we demonstrate the PanCam's performance in imaging a stained tissue sample.

Relevance:

90.00%

Publisher:

Abstract:

Stable isotope geochemistry is a valuable toolkit for addressing a broad range of problems in the geosciences. Recent technical advances provide information that was previously unattainable or provide unprecedented precision and accuracy. Two such techniques are site-specific stable isotope mass spectrometry and clumped isotope thermometry. In this thesis, I use site-specific isotope and clumped isotope data to explore natural gas development and carbonate reaction kinetics. In the first chapter, I develop an equilibrium thermodynamics model to calculate equilibrium constants for isotope-exchange reactions in small organic molecules. These equilibrium data provide a framework for interpreting the more complex data in the later chapters. In the second chapter, I demonstrate a method for measuring site-specific carbon isotopes in propane using high-resolution gas-source mass spectrometry. The method relies on the characteristic fragments created during electron ionization, from which I measure the relative isotopic enrichment of separate parts of the molecule. The technique will be applied to a range of organic compounds in the future. In the third chapter, I use this technique to explore diffusion, mixing, and other natural processes in natural gas basins. As time progresses and the mixture matures, different components such as kerogen and oil contribute to the propane in a natural gas sample. Each component imparts a distinct fingerprint on the site-specific isotope distribution within propane, which I can observe to understand the source composition and maturation of the basin. Finally, in Chapter Four, I study the reaction kinetics of clumped isotopes in aragonite. Despite its frequent use as a clumped isotope thermometer, the blocking temperature of aragonite is not known. Using laboratory heating experiments, I determine that the aragonite clumped isotope thermometer has a blocking temperature of 50–100 °C. I compare this result with natural samples from the San Juan Islands that exhibit a maximum clumped isotope temperature matching this blocking temperature. This thesis presents a framework for measuring site-specific carbon isotopes in organic molecules and new constraints on aragonite reaction kinetics, and it represents the foundation of a future generation of geochemical tools for the study of complex geologic systems.

Relevance:

90.00%

Publisher:

Abstract:

The embryonic and larval development of many freshwater fishes is already relatively well documented; the morphology of fish eggs, by contrast, has been covered only sparsely. For this reason the authors have attempted to prepare an identification key for fish eggs that covers the bulk of the German teleost species. The key also includes a discussion of problems of categorization and terminology.

Relevance:

90.00%

Publisher:

Abstract:

The rapid growth and development of Los Angeles City and County has been one of the phenomena of the present age. The growth of a city from 50,600 to 576,000, an increase of over 1000% in thirty years, is an unprecedented occurrence. It has given rise to a variety of problems of increasing magnitude.

Chief among these are: the supply of food, water, and shelter; the development of industry and markets; the prevention and removal of downtown congestion; and the protection of life and property. These, of course, are problems that any city must face. But in the case of a community which doubles its population every ten years, radical and heroic measures must often be taken.

Relevance:

90.00%

Publisher:

Abstract:

A novel fiber Bragg grating temperature sensor is proposed and experimentally demonstrated, with a long-period grating serving as a linear-response edge filter that converts wavelength into intensity-encoded information for interrogation. The sensor is embedded in an aluminum substrate with a larger coefficient of thermal expansion to enhance its temperature sensitivity. A large dynamic range of 110 °C and a high resolution of 0.02 °C are obtained in the experiments. The technique can be used for multiplexed measurements with one broadband source and one long-period grating, and is therefore low in cost. (C) 2004 Society of Photo-Optical Instrumentation Engineers.
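The edge-filter interrogation chain described above can be sketched as two linear maps: temperature shifts the Bragg wavelength, and the long-period grating's transmission edge converts that shift into an intensity change. All coefficients below are illustrative assumptions, not values from the paper.

```python
# Sketch of linear edge-filter interrogation of an FBG temperature sensor.

LAMBDA0_NM = 1550.0   # assumed Bragg wavelength at the reference temperature
S_NM_PER_C = 0.05     # assumed (substrate-enhanced) sensitivity, nm/degC
EDGE_SLOPE = 0.01     # assumed edge-filter transmission slope, per nm

def bragg_wavelength(temp_c, ref_c=20.0):
    """Temperature -> Bragg wavelength (linear model)."""
    return LAMBDA0_NM + S_NM_PER_C * (temp_c - ref_c)

def detected_intensity(temp_c, ref_c=20.0):
    """Normalized intensity after the linear edge filter."""
    return 0.5 + EDGE_SLOPE * (bragg_wavelength(temp_c, ref_c) - LAMBDA0_NM)

def temperature_from_intensity(i_norm, ref_c=20.0):
    """Invert the chain: intensity -> wavelength shift -> temperature."""
    d_lambda = (i_norm - 0.5) / EDGE_SLOPE
    return ref_c + d_lambda / S_NM_PER_C

# Round trip: a 75 degC reading survives encoding and decoding.
t_recovered = temperature_from_intensity(detected_intensity(75.0))
```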

Relevance:

90.00%

Publisher:

Abstract:

The intent of this study is to provide a formal apparatus that facilitates the investigation of problems in the methodology of science. The introduction contains several examples of such problems and motivates the subsequent formalism.

A general definition of a formal language is presented, and this definition is used to characterize an individual’s view of the world around him. A notion of empirical observation is developed which is independent of language. The interplay of formal language and observation is taken as the central theme. The process of science is conceived as the finding of that formal language that best expresses the available experimental evidence.

To characterize the manner in which a formal language imposes structure on its universe of discourse, the fundamental concepts of elements and states of a formal language are introduced. Using these, the notion of a basis for a formal language is developed as a collection of minimal states distinguishable within the language. The relation of these concepts to those of model theory is discussed.

An a priori probability defined on sets of observations is postulated as a reflection of an individual’s ontology. This probability, in conjunction with a formal language and a basis for that language, induces a subjective probability describing an individual’s conceptual view of admissible configurations of the universe. As a function of this subjective probability, and consequently of language, a measure of the informativeness of empirical observations is introduced and is shown to be intuitively plausible – particularly in the case of scientific experimentation.
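A standard stand-in for such an informativeness measure is surprisal, −log p, evaluated under the subjective probability: improbable observations carry more information. The thesis's own measure need not coincide with this; the example only illustrates how a probability on observations induces an information value.

```python
# Surprisal as an illustrative informativeness measure: an observation
# assigned subjective probability p carries -log2(p) bits of information.
from math import log2

def surprisal_bits(p):
    """Information (in bits) carried by an observation of probability p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -log2(p)

# An even-odds observation (p = 0.5) carries 1 bit; a surprising
# experimental outcome (p = 1/1024) carries 10 bits.
```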

The developed formalism is then systematically applied to the general problems presented in the introduction. The relationship of scientific theories to empirical observations is discussed, and certain tacit, unstatable knowledge is shown to be necessary to fully comprehend the meaning of realistic theories. The idea that many common concepts can be specified only by drawing on knowledge obtained from an infinite number of observations is presented, and the problems of reductionism are examined in this context.

A definition of when one formal language can be considered to be more expressive than another is presented, and the change in the informativeness of an observation as language changes is investigated. In this regard it is shown that the information inherent in an observation may decrease for a more expressive language.

The general problem of induction and its relation to the scientific method are discussed. Two hypotheses concerning an individual’s selection of an optimal language for a particular domain of discourse are presented and specific examples from the introduction are examined.

Relevance:

90.00%

Publisher:

Abstract:

The quasicontinuum (QC) method was introduced to coarse-grain crystalline atomic ensembles in order to bridge the scales from individual atoms to the micro- and mesoscales. Though many QC formulations have been proposed with varying characteristics and capabilities, a crucial cornerstone of all QC techniques is the concept of summation rules, which attempt to efficiently approximate the total Hamiltonian of a crystalline atomic ensemble by a weighted sum over a small subset of atoms. In this work we propose a novel, fully nonlocal, energy-based formulation of the QC method with support for legacy and new summation rules through a general energy-sampling scheme. Our formulation does not conceptually differentiate between atomistic and coarse-grained regions and thus allows for seamless bridging without domain-coupling interfaces. Within this structure, we introduce a new class of summation rules which leverage the affine kinematics of this QC formulation to most accurately integrate thermodynamic quantities of interest. By comparing this new class of summation rules to commonly employed rules through analysis of energy and spurious-force errors, we find that the new rules produce no residual or spurious force artifacts in the large-element limit under arbitrary affine deformation, while allowing us to seamlessly bridge to full atomistics. We verify that the new summation rules exhibit significantly smaller force artifacts and energy approximation errors than all comparable previous summation rules through a comprehensive suite of examples with spatially non-uniform QC discretizations in two and three dimensions. Owing to the unique structure of these summation rules, we also use the new formulation to study scenarios with large regions of free surface, a class of problems previously out of reach of the QC method. Lastly, we present the key components of a high-performance, distributed-memory realization of the new method, including a novel algorithm for supporting unparalleled levels of deformation. Overall, this new formulation and implementation allows us to efficiently perform simulations containing an unprecedented number of degrees of freedom with low approximation error.
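The summation-rule idea can be made concrete with a toy example: the total energy of an atomic chain, nominally a sum over all atoms, is approximated by a weighted sum over a few sampling atoms. The 1D harmonic chain and the one-point rule below are illustrative assumptions, not the thesis's scheme.

```python
# Toy summation rule: approximate a chain's total energy by a weighted
# sum over sampling atoms, E ~ sum_s w_s * e_s.

def site_energy(x, i):
    """Harmonic nearest-neighbor site energy for atom i of chain x
    (half of each adjacent bond's energy, k = 1, rest length 1)."""
    e = 0.0
    if i > 0:
        e += 0.5 * 0.5 * (x[i] - x[i - 1] - 1.0) ** 2
    if i < len(x) - 1:
        e += 0.5 * 0.5 * (x[i + 1] - x[i] - 1.0) ** 2
    return e

def total_energy(x):
    """Exact Hamiltonian: sum of all site energies."""
    return sum(site_energy(x, i) for i in range(len(x)))

def sampled_energy(x, samples):
    """Summation-rule approximation over (atom index, weight) pairs."""
    return sum(w * site_energy(x, i) for i, w in samples)

# Uniformly stretched chain: every interior site energy is identical, so
# weighting one interior sampling atom by the total atom count reproduces
# the energy up to boundary terms.
n = 101
x = [1.01 * i for i in range(n)]       # 1% uniform stretch
exact = total_energy(x)
approx = sampled_energy(x, [(50, n)])  # one sampling atom, weight n
```

Real QC summation rules choose the sampling atoms and weights from the finite-element discretization so that such errors vanish or stay controlled for non-uniform deformations.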

Relevance:

90.00%

Publisher:

Abstract:

This Final Degree Project is framed within a project of the CompMech research group. The project consists of the design and construction of a parallel-kinematics mechanism for dynamic testing. This work covers the tasks required to study the mechanism's workspace and to determine the most appropriate dimensions from kinematic considerations. Three candidate mechanisms serve as the starting point, from which one will later be selected to complete the design cycle. The first step is the kinematic analysis of the mechanisms: the position and velocity problems, which are needed for the subsequent workspace study, are solved, and their solution is implemented in a Matlab program. The workspace of each mechanism is then obtained, together with the singular positions within it and its variation under changes in the dimensions. It is also of interest to determine the regions of the workspace in which the mechanism's motion is easiest to perform. Once the workspaces of the mechanisms and their variability with dimensional changes are known, the most appropriate mechanism will be chosen to continue the design cycle. Additional considerations contributed by other members of the group will also be taken into account in this choice.
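The workspace study described above can be illustrated by solving the position problem over a grid of joint values and collecting the reachable points. A planar two-link serial arm stands in here for the actual parallel mechanism (which is not specified in the abstract); link lengths are arbitrary illustrative values.

```python
# Workspace determination by sampling the position problem of a planar
# two-link arm (an illustrative stand-in for the parallel mechanism).
from math import cos, sin, pi, hypot

L1, L2 = 1.0, 0.7  # assumed link lengths

def forward_position(q1, q2):
    """Position problem: end-effector location for joint angles q1, q2."""
    x = L1 * cos(q1) + L2 * cos(q1 + q2)
    y = L1 * sin(q1) + L2 * sin(q1 + q2)
    return x, y

def sample_workspace(n=60):
    """Solve the position problem on an n x n joint-space grid."""
    pts = []
    for i in range(n):
        for j in range(n):
            q1 = 2 * pi * i / n
            q2 = 2 * pi * j / n
            pts.append(forward_position(q1, q2))
    return pts

pts = sample_workspace()
radii = [hypot(x, y) for x, y in pts]
# For this arm the reachable set is the annulus |L1 - L2| <= r <= L1 + L2;
# varying L1, L2 and re-sampling shows how the workspace changes with the
# dimensions, which is the kind of study the project performs in Matlab.
```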

Relevance:

90.00%

Publisher:

Abstract:

We design three-zone annular filters for application to optical storage systems. The designed filters extend the depth of focus and realize transverse superresolution simultaneously, which greatly improves the performance of an optical storage system. We also propose two feasible schemes to improve the imaging resolution of a three-dimensional imaging system. One scheme relies on a complex filter formed by cascading a three-zone phase filter and a three-zone amplitude filter; this complex filter combines the optimized transverse superresolution and the optimized axial superresolution of two different filters in a single element, greatly improving three-dimensional imaging performance. The other scheme relies on a single three-zone complex filter: we propose a three-zone complex filter with a phase shift of 0.8π, which offers a larger design margin, better imaging quality, and stronger three-dimensional superresolution capability. (c) 2006 Elsevier GmbH. All rights reserved.
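The depth-of-focus effect of such pupil filters can be sketched numerically. For a rotationally symmetric pupil P(ρ), the axial intensity is I(u) = |2 ∫₀¹ P(ρ) exp(i u ρ²/2) ρ dρ|², with focus at u = 0. A simple central-obstruction annulus stands in below for the designed three-zone filters, and the radius is an arbitrary illustrative value.

```python
# Axial response of a rotationally symmetric pupil filter, evaluated by
# a midpoint-rule quadrature of the axial diffraction integral.
import cmath

def axial_intensity(u, pupil, n=2000):
    """I(u) = |2 * integral_0^1 P(rho) exp(i u rho^2 / 2) rho drho|^2."""
    s = 0j
    for k in range(n):
        rho = (k + 0.5) / n
        s += pupil(rho) * cmath.exp(1j * u * rho * rho / 2) * rho / n
    return abs(2 * s) ** 2

def clear(rho):
    return 1.0                      # unobstructed pupil

def annular(rho):
    return 1.0 if rho >= 0.6 else 0.0  # assumed inner radius 0.6

# Normalized axial responses: the annular pupil trades focal intensity
# for a slower falloff along u, i.e. an extended depth of focus.
ratio_clear = axial_intensity(8.0, clear) / axial_intensity(0.0, clear)
ratio_annular = axial_intensity(8.0, annular) / axial_intensity(0.0, annular)
```

A three-zone design adds amplitude or phase degrees of freedom to shape both the axial and transverse responses at once, which is what the abstract's filters optimize.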

Relevance:

90.00%

Publisher:

Abstract:

This thesis presents methods for incrementally constructing controllers in the presence of uncertainty and nonlinear dynamics. The basic setting is motion planning subject to temporal logic specifications. Broadly, two categories of problems are treated. The first is reactive formal synthesis when so-called discrete abstractions are available. The fragment of linear-time temporal logic (LTL) known as GR(1) is used to express assumptions about an adversarial environment and requirements of the controller. Two problems of changes to a specification are posed that concern the two major aspects of GR(1): safety and liveness. Algorithms providing incremental updates to strategies are presented as solutions. In support of these, an annotation of strategies is developed that facilitates repeated modifications. A variety of properties are proven about it, including necessity of its existence and its sufficiency for a strategy to be winning. The second category of problems considered is non-reactive (open-loop) synthesis in the absence of a discrete abstraction. Instead, the presented stochastic optimization methods directly construct a control input sequence that achieves low cost and satisfies an LTL formula. Several relaxations are considered as heuristics to address the rarity of sampling trajectories that satisfy an LTL formula, and they are demonstrated to improve convergence rates for Dubins-car and single-integrator systems subject to a recurrence task.

Relevance:

90.00%

Publisher:

Abstract:

A review of the theory of electron scattering indicates that low incident beam energies and large scattering angles are the favorable conditions for the observation of optically forbidden transitions in atoms and molecules.

An apparatus capable of yielding electron impact spectra at 90° with incident electron beam energies between 30 and 50 electron volts is described. The resolution of the instrument is about 1 electron volt.

Impact spectra of thirteen molecules have been obtained. Known forbidden transitions from the corresponding ground states to the helium 2³S, the hydrogen b³Σu⁺, the nitrogen A³Σu⁺, B³Πg, a′Πg, and C³Πu, the carbon monoxide a³Π, the ethylene ã³B1u, and the benzene ã³B1u states have been observed.

In addition, singlet-triplet vertical transitions in acetylene, propyne, propadiene, norbornadiene, and quadricyclene, peaking at 5.9, 5.9, 4.5, 3.8, and 4.0 eV (±0.2 eV), respectively, have been observed and assigned for the first time.

Relevance:

90.00%

Publisher:

Abstract:

I. The 3.7 Å Crystal Structure of Horse Heart Ferricytochrome C.

The crystal structure of horse heart ferricytochrome c has been determined to a resolution of 3.7 Å using the multiple isomorphous replacement technique. Two isomorphous derivatives were used in the analysis, leading to a map with a mean figure of merit of 0.458. The quality of the resulting map was extremely high, even though the derivative data did not appear to be of high quality.

Although it was impossible to fit the known amino acid sequence to the calculated structure in an unambiguous way, many important features of the molecule could still be determined from the 3.7 Å electron density map. Among these was the fact that cytochrome c contains little or no α-helix. The polypeptide chain appears to be wound about the heme group in such a way as to form a loosely packed hydrophobic core in the molecule.

The heme group is located in a cleft on the molecule with one edge exposed to the solvent. The fifth coordinating ligand is His 18 and the sixth coordinating ligand is probably neither His 26 nor His 33.

The high resolution analysis of cytochrome c is now in progress and should be completed within the next year.

II. The Application of the Karle-Hauptman Tangent Formula to Protein Phasing.

The Karle-Hauptman tangent formula has been shown to be applicable to the refinement of previously determined protein phases. Tests were made with both the cytochrome c data from Part I and a theoretical structure based on the myoglobin molecule. The refinement process was found to be highly dependent upon the manner in which the tangent formula was applied. Iterative procedures did not work well, at least at low resolution.
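For reference, the tangent formula in its standard form estimates the phase φ𝐡 of reflection 𝐡 from pairs of already-phased reflections:

```latex
\tan\varphi_{\mathbf h} \;=\;
\frac{\sum_{\mathbf k}\,|E_{\mathbf k}\,E_{\mathbf h-\mathbf k}|\,
      \sin(\varphi_{\mathbf k}+\varphi_{\mathbf h-\mathbf k})}
     {\sum_{\mathbf k}\,|E_{\mathbf k}\,E_{\mathbf h-\mathbf k}|\,
      \cos(\varphi_{\mathbf k}+\varphi_{\mathbf h-\mathbf k})}
```

where the E are normalized structure factors. In the refinement described above, the current phase estimates are substituted on the right-hand side and φ𝐡 is updated, which is why the manner of application (iterative or not) matters so much.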

The tangent formula worked very well in selecting the true phase from the two possible phase choices resulting from a single isomorphous replacement phase analysis. The only restriction on this application is that the heavy atoms form a non-centric cluster in the unit cell.

Pages 156 through 284 in this Thesis consist of previously published papers relating to the above two sections. References to these papers can be found on page 155.