243 results for Shrinking


Relevance:

10.00%

Publisher:

Abstract:

This presentation will show how a grassroots initiative has grown into the Florida International University (FIU) Libraries becoming an instrumental part of online learning. It describes some of the marketing and outreach efforts that have been successful and shares ideas on how to build alliances and networks with online faculty and students. Along with outreach efforts, the presentation demonstrates some of the successful tools used to meet the needs of online students. Some of these tools include becoming embedded in courses, building course- and program-specific LibGuides, using Adobe Connect to reach students, creating simple YouTube videos, and creating more professional videos with FIU Online. The presentation concludes with tips on how to keep the workload manageable when distance-learning programs are growing at the same time as library budgets and resources are shrinking.

Relevance:

10.00%

Publisher:

Abstract:

Germanium attracted great interest in the 1950s, when it was used for the first transistor device. However, owing to its water-soluble and unstable oxide, it was surpassed by silicon. Today, as device dimensions shrink, silicon oxide is no longer a suitable gate dielectric due to gate leakage, and high-κ dielectrics such as Al2O3 and HfO2 are being used instead. Germanium (Ge) is a promising material to replace or integrate with silicon (Si) to continue the trend of Moore's law. Germanium has higher intrinsic carrier mobilities than silicon and is silicon-fab compatible, so it would be an ideal material to integrate into silicon-based technologies. The progression towards nanoelectronics requires many in-depth studies. Dynamic TEM studies allow reactions to be observed, giving a better understanding of mechanisms and of how an external stimulus may affect a material or structure. This thesis details in situ TEM experiments that investigate some essential processes for germanium nanowire (NW) integration into nanoelectronic devices, namely doping and Ohmic contact formation. Chapter 1 reviews recent advances in dynamic TEM studies of semiconductor (namely silicon and germanium) nanostructures. The areas covered are nanowire/crystal growth, germanide/silicide formation, irradiation, electrical biasing, batteries, and strain. Chapter 2 details the study of ion irradiation and the damage incurred in germanium nanowires. An experimental set-up is described that allows concurrent observation in the TEM of a nanowire following sequential ion implantation steps. Grown nanowires were deposited on a FIB-labelled SiN membrane grid, which facilitated HRTEM imaging and easy navigation to a specific nanowire. Cross sections of irradiated nanowires were also prepared to evaluate the damage across the nanowire diameter. Experiments were conducted at 30 kV and 5 kV ion energies to study the effect of beam energy on nanowires of varied diameters. The results on nanowires were also compared to the damage profile in bulk germanium at both 30 kV and 5 kV ion beam energies. Chapter 3 extends the work of chapter 2: nanowires are annealed after ion irradiation. In situ thermal annealing experiments were conducted to observe the recrystallization of the nanowires. A method to promote solid-phase epitaxial growth is investigated by irradiating only small areas of a nanowire, so as to maintain a seed from which epitaxial growth can initiate. It was also found that strain in the nanowire greatly affects defect formation and random nucleation and growth. To obtain full recovery of the crystal structure of a nanowire, a stable support that reduces strain in the nanowire is essential, as well as a seed from which solid-phase epitaxial growth can initiate. Chapter 4 details the study of nickel germanide formation in germanium nanostructures. Rows of EBL (electron beam lithography)-defined Ni-capped germanium nanopillars were extracted in FIB cross sections and annealed in situ to observe germanide formation. Chapter 5 summarizes the key conclusions of each chapter and discusses an outlook on the future of germanium nanowire studies to facilitate their incorporation into nanodevices.

Relevance:

10.00%

Publisher:

Abstract:

Based on detailed reconstructions of global distribution patterns, both paleoproductivity and the benthic δ13C record of CO2 dissolved in the deep ocean differed strongly between the Last Glacial Maximum and the Holocene. With the onset of Termination I, about 15,000 years ago, the new (export) production of low- and mid-latitude upwelling cells started to decline by more than 2-4 Gt carbon/year. This reduction is regarded as a main factor leading both to the simultaneous rise in atmospheric CO2 recorded in ice cores and, with a delay of more than 1,000 years, to a large-scale gradual CO2 depletion of the deep ocean by about 650 Gt C. This estimate is based on an average increase in benthic δ13C of 0.4-0.5 per mil. The decrease in new production also matches a clear 13C depletion of organic matter, possibly recording the end of extreme nutrient utilization in upwelling cells. As shown by Sarnthein et al. [1987], the productivity reversal appears to have been triggered by a rapid reduction in the strength of the meridional trades, which in turn was linked, via a shrinking extent of sea ice, to a massive increase in high-latitude insolation, i.e., to orbital forcing as the primary cause.
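The ~650 Gt C figure can be checked with a simple isotope mass balance: removing isotopically light, organic-derived carbon raises the mean δ13C of the remaining reservoir. The deep-ocean DIC inventory (~38,000 Gt C) and the δ13C of marine organic matter (about -25 per mil) used below are standard textbook values, not numbers taken from this abstract.

```python
# Back-of-the-envelope isotope mass balance for the deep-ocean CO2 depletion.
# Removing carbon with d13C ~ -25 permil from a reservoir of mean d13C ~ 0
# permil shifts the reservoir's d13C upward by dC * (0 - (-25)) / inventory.

dic_inventory_gt = 38_000   # deep-ocean dissolved inorganic carbon, Gt C (textbook value)
delta_shift = 0.45          # benthic d13C increase, permil (0.4-0.5 in the abstract)
d13c_organic = -25.0        # typical marine organic matter d13C, permil
d13c_ocean = 0.0            # approximate mean deep-ocean DIC d13C, permil

# Solve the linearized mass balance for the mass of carbon removed:
dC = dic_inventory_gt * delta_shift / (d13c_ocean - d13c_organic)
print(round(dC))  # -> 684, consistent with the ~650 Gt C estimate
```

The agreement (684 vs. ~650 Gt C) suggests the abstract's estimate follows essentially this linearized balance, with the small difference attributable to the chosen inventory and end-member values.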

Relevance:

10.00%

Publisher:

Abstract:

To the apparent surprise of policy makers at the provincial and school board levels, Ontario's public schools are about to experience a massive exodus of principals and vice-principals. This report, funded by a grant from the Ontario Principals' Council, details the scale of the retirement wave currently hitting Ontario's public school boards. Data collected from 946 practicing school administrators suggest that retirement rates will be almost 20 per cent higher than provincial estimates. Anecdotal evidence suggests that the pool of qualified candidates for these positions is also shrinking: already, fewer individuals are applying for each available vacancy. The study examines the major dissatisfiers in the current role of school principal as experienced by incumbents. Interviews were also conducted with 92 individuals who had been identified as exceptional candidates for the principalship but had opted not to follow that career path, in order to determine which factors were most important in their decision making. The report concludes with recommendations for the province, school boards, and principals' organizations.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this paper is to analyse the state of investigative journalism in Mexico, especially as practiced at the local level in the provinces. This research is based on a case study conducted in Morelia, the capital city of the state of Michoacán. The empirical evidence shows an evident divergence regarding the practice of investigative journalism: on the one hand, journalists are aware of what the concept involves and consider that they practice it on a regular basis; on the other, the content analysis proves otherwise. In other words, what is actually printed differs significantly from the news workers' perceptions, revealing a poorly developed practice of journalistic investigation.

Relevance:

10.00%

Publisher:

Abstract:

We present self-consistent, axisymmetric core-collapse supernova simulations performed with the Prometheus-Vertex code for 18 pre-supernova models in the range of 11–28 M☉, including progenitors recently investigated by other groups. All models develop explosions, but depending on the progenitor structure they can be divided into two classes. With a steep density decline at the Si/Si–O interface, the arrival of this interface at the shock front leads to a sudden drop of the mass-accretion rate, triggering a rapid approach to explosion. With a more gradually decreasing accretion rate, it takes longer for the neutrino heating to overcome the accretion ram pressure, and explosions set in later. Early explosions are facilitated by high mass-accretion rates after bounce and correspondingly high neutrino luminosities, combined with a pronounced drop of the accretion rate and ram pressure at the Si/Si–O interface. Because of rapidly shrinking neutron star radii and shock fronts that recede after passing through their maximum radii, our models exhibit short advection timescales, which favor the efficient growth of the standing accretion-shock instability. The latter plays a supportive role at least in initiating the re-expansion of the stalled shock before runaway. Taking into account the effects of turbulent pressure in the gain layer, we derive a generalized condition for the critical neutrino luminosity that captures the explosion behavior of all models very well. We validate the robustness of our findings by testing the influence of stochasticity, numerical resolution, and approximations in some aspects of the microphysics.

Relevance:

10.00%

Publisher:

Abstract:

Internal curing is a relatively new technique used to promote hydration of Portland cement concretes. The fundamental concept is to provide reservoirs of water within the matrix such that the water does not increase the initial water/cementitious materials ratio of the mixture but is available to help hydration continue once the system starts to dry out. The reservoirs used in the US are typically in the form of lightweight fine aggregate (LWFA) that is saturated prior to batching. Considerable work has been conducted both in the laboratory and in the field to confirm that this approach is fundamentally sound and practical for construction purposes. A number of bridge decks have been successfully constructed around the US, including one in Iowa in 2013. It is reported that inclusion of about 20% to 30% LWFA will not only improve strength development and potential durability but, more importantly, will significantly reduce shrinkage, thus reducing the risk of cracking. The aim of this work was to investigate the feasibility of such an approach in a bridge deck.
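One widely used way to size the LWFA dose (not stated in this abstract) is the Bentz-Lura-Roberts mass balance, which matches the water consumed by chemical shrinkage of the cement to the water supplied by the saturated aggregate. The mix parameters below are illustrative defaults, not this project's design values.

```python
# Sketch of the Bentz-Lura-Roberts internal-curing mass balance:
#     M_lwa = (C_f * CS * alpha_max) / (S * phi_lwa)
# All inputs are illustrative defaults, not the Iowa deck's actual mix design.

C_f = 350.0       # cement content, kg/m^3
CS = 0.07         # chemical shrinkage, kg water per kg cement (typical for OPC)
alpha_max = 1.0   # expected maximum degree of hydration
S = 1.0           # degree of saturation of the aggregate (1.0 = fully saturated)
phi_lwa = 0.15    # LWFA absorption capacity, kg water per kg dry aggregate

M_lwa = (C_f * CS * alpha_max) / (S * phi_lwa)  # kg of dry LWFA per m^3 concrete
print(round(M_lwa, 1))  # -> 163.3
```

With these illustrative numbers, roughly 160 kg/m^3 of saturated LWFA would supply the curing water, which is in the ballpark of the "20% to 30% of fine aggregate" replacement levels mentioned above.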

Relevance:

10.00%

Publisher:

Abstract:

Over the past several decades, work on infrared sensor applications has advanced considerably worldwide. A persistent difficulty, however, is that objects are often not clear enough, or cannot always be easily distinguished, in the image obtained of the observed scene. Infrared image enhancement has played an important role in the development of infrared computer vision, image processing, and non-destructive testing, among other fields. This thesis addresses infrared image enhancement techniques in two respects: the processing of a single infrared image in the hybrid space-frequency domain, and the fusion of infrared and visible images using the nonsubsampled contourlet transform (NSCT). Image fusion can be regarded as a continuation of the single-infrared-image enhancement model, in that it combines infrared and visible images into a single image that represents and enhances all the useful information and features of the source images, since no single image can contain all the relevant or available information owing to the restrictions inherent to any single imaging sensor. We first review the development of infrared image enhancement techniques; we then focus on single-infrared-image enhancement and propose a hybrid-domain enhancement scheme with an improved fuzzy threshold evaluation method, which yields higher image quality and improves human visual perception. The infrared and visible image fusion techniques are built on an accurate registration of the source images acquired by the different sensors. The SURF-RANSAC algorithm is applied for registration throughout this research, producing very accurately registered images and increased benefits for the fusion processing. For the fusion of infrared and visible images, a series of advanced and efficient approaches is proposed. A standard multi-channel NSCT-based fusion method is presented as a reference for the fusion approaches that follow. A joint fusion approach involving the Adaptive-Gaussian NSCT and the wavelet transform (WT) is proposed, which leads to fusion results that are better than those obtained with general non-adaptive methods. An NSCT-based fusion approach employing compressed sensing (CS) and total variation (TV) on sparsely sampled coefficients, with accurate reconstruction of the fused coefficients, is proposed; it obtains much better fusion results through pre-enhancement of the infrared image and by reducing the redundant information in the fusion coefficients. Finally, an NSCT-based fusion procedure using a fast iterative-shrinking compressed sensing (FISCS) technique is proposed to compress the decomposed coefficients and reconstruct the fused coefficients during the fusion process, leading to better results more quickly and efficiently.
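The abstract does not spell out the internals of FISCS, but iterative-shrinkage reconstruction in compressed sensing is typically built on the soft-thresholding step of ISTA-type algorithms. The following is a minimal, self-contained sketch of that step on a toy problem; the measurement matrix, step size, and penalty are illustrative assumptions, not the thesis's actual operators.

```python
# Toy ISTA (iterative shrinkage-thresholding) sketch for recovering a sparse
# x from measurements y = A x. All numbers are illustrative.

def soft_threshold(v, t):
    """Shrink each coefficient toward zero by t (the 'shrinking' step)."""
    return [max(abs(c) - t, 0.0) * (1.0 if c >= 0 else -1.0) for c in v]

def matvec(M, x):
    return [sum(a * b for a, b in zip(row, x)) for row in M]

def ista(A, y, lam=0.05, step=0.2, iters=500):
    """Minimize 0.5*||A x - y||^2 + lam*||x||_1 by gradient steps + shrinkage."""
    At = [list(col) for col in zip(*A)]      # A transposed
    x = [0.0] * len(A[0])
    for _ in range(iters):
        r = [ri - yi for ri, yi in zip(matvec(A, x), y)]  # A x - y
        grad = matvec(At, r)                              # A^T (A x - y)
        x = soft_threshold([xi - step * g for xi, g in zip(x, grad)],
                           step * lam)
    return x

# Three measurements of a 4-dimensional signal with a single nonzero entry.
A = [[1.0, 0.0, 1.0, 0.0],
     [0.0, 1.0, 0.0, 1.0],
     [1.0, 1.0, 0.0, 0.0]]
x_true = [0.0, 0.0, 2.0, 0.0]
y = matvec(A, x_true)
x_hat = ista(A, y)
print([round(v, 2) for v in x_hat])  # x_hat[2] is near 2, minus a small l1 bias
```

In a fusion pipeline, the same shrinkage operator would act on sparse NSCT coefficients rather than on a raw vector, but the gradient-plus-shrinkage structure is the same.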

Relevance:

10.00%

Publisher:

Abstract:

Hydrometallurgical process modeling is the main objective of this Master's thesis work. Three leaching processes, namely high-pressure pyrite oxidation, direct-oxidation zinc concentrate (sphalerite) leaching, and gold chloride leaching using a rotating disc electrode (RDE), are modeled and simulated using the gPROMS process simulation program in order to evaluate its model-building capabilities. The leaching mechanism in each case is described in terms of a shrinking-core model. The mathematical modeling carried out included process-model development based on the available literature, estimation of reaction kinetic parameters, and assessment of model reliability by checking the goodness of fit and the cross-correlation between the estimated parameters through the use of correlation matrices. The estimated parameter values in each case were compared with those obtained using the Modest simulation program. Further, based on the estimated reaction kinetic parameters, reactor simulation and modeling for direct-oxidation zinc concentrate (sphalerite) leaching is carried out in Aspen Plus V8.6. The zinc leaching autoclave is based on the Cominco reactor configuration and is modeled as a series of continuous stirred-tank reactors (CSTRs). The sphalerite conversion is calculated, and a sensitivity analysis is carried out to determine the optimum reactor operating temperature and oxygen mass flow rate. In this way, the implementation of reaction kinetic models in a process flowsheet simulation environment has been demonstrated.
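For surface-reaction-controlled shrinking-core kinetics, conversion and time are related by 1 - (1 - X)^(1/3) = t/τ, where τ is the time for complete conversion. As a hedged illustration of the parameter-estimation step, here is a plain linear least-squares fit on synthetic data; the thesis itself uses gPROMS and Modest with more general estimators, and these data points are invented.

```python
# Least-squares estimate of the shrinking-core time constant tau from
# conversion-time data, assuming surface-reaction control:
#     1 - (1 - X)**(1/3) = t / tau
# The data below are synthetic (generated near tau = 10 h), not thesis data.

t_data = [0.5, 1.0, 2.0, 4.0, 6.0]        # time, h
X_data = [0.14, 0.27, 0.49, 0.78, 0.94]   # fractional conversion

# Linearize: g(X) = 1 - (1-X)^(1/3) should be proportional to t, so the
# least-squares tau for g = t/tau is tau = sum(t^2) / sum(t*g).
g = [1.0 - (1.0 - X) ** (1.0 / 3.0) for X in X_data]
tau = sum(ti * ti for ti in t_data) / sum(ti * gi for ti, gi in zip(t_data, g))
print(round(tau, 2))  # close to the tau = 10 h used to generate the data
```

Goodness of fit and parameter cross-correlation, as mentioned above, would then be assessed on the residuals of this fit; with a single parameter there is of course no cross-correlation to check.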

Relevance:

10.00%

Publisher:

Abstract:

The reaction of post-consumer poly(ethylene terephthalate) with 7.5 M aqueous sulfuric acid was investigated as a function of temperature, time, and particle size. The reaction extent reached 80% in four days at 100 °C and 90% in 5 hours at 135 °C. The terephthalic acid (TPA) obtained was purified and judged to be of the same level of quality as the commercial product after tests of elemental analysis, particle size, and color. It was concluded that the hydrolysis occurred preferentially at the chain ends and at the surface, with acid diffusion into the polymer structure as the controlling mechanism. The shrinking-core model can explain the reaction kinetics.
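For the diffusion-controlled case identified here, the standard shrinking-core relation is 1 - 3(1 - X)^(2/3) + 2(1 - X) = t/τ, where τ is the time for complete conversion. A small sketch (with an illustrative τ, not a value fitted in the paper) inverts this relation numerically to obtain conversion versus time:

```python
# Diffusion-through-product-layer shrinking-core model:
#     g(X) = 1 - 3*(1 - X)**(2/3) + 2*(1 - X) = t / tau
# tau below is illustrative, not the value fitted in the paper.

def g(X):
    return 1.0 - 3.0 * (1.0 - X) ** (2.0 / 3.0) + 2.0 * (1.0 - X)

def conversion(t, tau):
    """Invert g(X) = t/tau for X in [0, 1] by bisection (g is increasing)."""
    target = min(t / tau, 1.0)
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if g(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

tau = 120.0  # h; illustrative complete-conversion time under diffusion control
for t in (5, 30, 60, 120):
    print(t, round(conversion(t, tau), 3))
```

The characteristic slowdown of this curve at high conversion, as the acid must diffuse through an ever-thicker reacted layer, is consistent with the long times to reach 80-90% extent reported above.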

Relevance:

10.00%

Publisher:

Abstract:

Work carried out at the company CAF Power&Automation.

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents an investigation of endoscopic optical coherence tomography (OCT). As a noninvasive imaging modality, OCT has emerged as an increasingly important diagnostic tool for many clinical applications. Despite its many merits, such as high resolution and depth resolvability, a major limitation is the relatively shallow penetration depth in tissue (about 2-3 mm), mainly due to tissue scattering and absorption. To overcome this limitation, many different endoscopic OCT systems have been developed. By utilizing a minimally invasive endoscope, the OCT probing beam can be brought into the close vicinity of the tissue of interest, bypassing the scattering of intervening tissues, so that it can collect the reflected light signal from the desired depth and provide a clear image of the physiological structure of the region, which cannot be resolved by traditional OCT. In this thesis, three endoscope designs are studied. While they rely on vastly different principles, they all converge to solve this long-standing problem.

A hand-held endoscope with manual scanning is explored first. When a user holds a hand-held endoscope to examine samples, the movement of the device provides a natural scan. We proposed and implemented an optical tracking system to estimate and record the trajectory of the device. By registering each OCT axial scan with the spatial information obtained from the tracking system, one can simply 'paint' a desired volume and obtain any arbitrary scanning pattern by manually waving the endoscope over the region of interest. The accuracy of the tracking system was measured to be about 10 microns, which is comparable to the lateral resolution of most OCT systems. A targeted phantom sample and biological samples were manually scanned, and the reconstructed images verified the method.
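The 'painting' idea (registering each axial scan with the tracked probe position and accumulating it into a voxel grid) can be sketched as follows. The grid dimensions, tracked positions, and A-scan values are invented for illustration and do not reproduce the thesis's hardware or registration details.

```python
# Sketch: accumulate manually scanned OCT A-scans into a lateral voxel grid
# using tracked (x, y) probe positions. All values are synthetic.

GRID = 4     # lateral voxels per side (illustrative)
DEPTH = 3    # axial samples per A-scan (illustrative)
VOXEL = 0.5  # lateral voxel size, mm (illustrative)

# acc[ix][iy] holds (running sum of A-scans, count), so that voxels revisited
# during manual waving are averaged rather than overwritten.
acc = [[([0.0] * DEPTH, 0) for _ in range(GRID)] for _ in range(GRID)]

def paint(x_mm, y_mm, ascan):
    """Register one A-scan at the tracked lateral position."""
    ix, iy = int(x_mm / VOXEL), int(y_mm / VOXEL)
    if 0 <= ix < GRID and 0 <= iy < GRID:
        sums, n = acc[ix][iy]
        acc[ix][iy] = ([s + a for s, a in zip(sums, ascan)], n + 1)

# Two passes over the same spot (within tracking accuracy) average together.
paint(0.6, 0.6, [1.0, 2.0, 3.0])
paint(0.7, 0.55, [3.0, 2.0, 1.0])

sums, n = acc[1][1]
mean_ascan = [s / n for s in sums]
print(mean_ascan)  # -> [2.0, 2.0, 2.0]
```

A real implementation would interpolate between voxels and use the full tracked pose rather than nearest-voxel binning, but the register-then-accumulate structure is the same.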

Next, we investigated a mechanical way to steer the beam in an OCT endoscope, termed paired-angle-rotation scanning (PARS). This concept was proposed by my colleague, and we further developed the technology by enhancing the longevity of the device, reducing the diameter of the probe, and shrinking the form factor of the hand-piece. Several families of probes have been designed and fabricated with various optical performances. They have been applied to different applications, including collector channel examination for glaucoma stent implantation and vitreous remnant detection during live animal vitrectomy.

Lastly, a novel scanning method with no moving parts was devised. This approach is based on the electro-optic (EO) effect of a KTN crystal. With Ohmic-contact electrodes, the KTN crystal can exhibit a special mode of the EO effect, termed the space-charge-controlled electro-optic effect, in which carrier electrons are injected into the material via the Ohmic contacts. By applying a high voltage across the material, a linear phase profile can be built up under this mode, which in turn deflects the light beam passing through. We constructed a relay telescope to adapt the KTN deflector to a bench-top OCT scanning system. One of the major technical challenges for this system is the strong chromatic dispersion of the KTN crystal within the wavelength band of the OCT system. We investigated its impact on the acquired OCT images and proposed a new approach to estimate and compensate the actual dispersion. Compared with traditional methods, the new method is more computationally efficient and accurate. Biological samples were scanned by this KTN-based system, and the acquired images demonstrated the feasibility of using the system in an endoscopy setting. Above all, my research aims to provide solutions for implementing an OCT endoscope. As the technology evolves from manual, to mechanical, to electrical approaches, different solutions are presented. Since each has its own advantages and disadvantages, one has to determine the actual requirements and select the best fit for a specific application.
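Chromatic dispersion in spectral-domain OCT is commonly compensated numerically by multiplying the measured spectrum by a conjugate phase before the Fourier transform. The following is a minimal, self-contained sketch of that idea using a quadratic spectral phase; the dispersion coefficient, array size, and single-reflector spectrum are illustrative assumptions, not measured KTN values or the thesis's actual estimation algorithm.

```python
# Sketch: numerical dispersion compensation in spectral-domain OCT.
# A dispersive element (modeled as a quadratic spectral phase) blurs the
# axial point-spread function; multiplying the spectrum by the conjugate
# phase before the Fourier transform restores it.

import cmath

def dft(x, sign=-1):
    """Naive DFT, adequate for this small demonstration."""
    n = len(x)
    return [sum(x[t] * cmath.exp(sign * 2j * cmath.pi * f * t / n)
                for t in range(n)) for f in range(n)]

N = 32
a2 = 0.02  # quadratic dispersion coefficient, rad per spectral bin^2 (illustrative)

# Ideal spectrum of a single reflector at depth bin 5 (pure complex tone).
spectrum = [cmath.exp(1j * 2 * cmath.pi * 5 * k / N) for k in range(N)]

# Dispersion adds a phase a2*(k - N/2)^2 across the spectral bins k.
dispersed = [s * cmath.exp(1j * a2 * (k - N / 2) ** 2)
             for k, s in enumerate(spectrum)]

# Compensation: multiply by the conjugate phase before transforming.
compensated = [s * cmath.exp(-1j * a2 * (k - N / 2) ** 2)
               for k, s in enumerate(dispersed)]

def peak_sharpness(spec):
    """Fraction of A-scan energy in the tallest bin (1.0 = perfectly sharp)."""
    a_scan = [abs(v) for v in dft(spec)]
    return max(a_scan) / sum(a_scan)

print(peak_sharpness(dispersed) < peak_sharpness(compensated))  # -> True
```

In practice the compensating phase is not known a priori; estimating it from the images themselves is the part the thesis's method addresses, which this sketch deliberately leaves out.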

Relevance:

10.00%

Publisher:

Abstract:

The final publication is available at Springer via http://dx.doi.org/10.1007/s10853-015-9458-2

Relevance:

10.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Master in Architecture, presented at the Universidade de Lisboa - Faculdade de Arquitectura.

Relevance:

10.00%

Publisher:

Abstract:

Libraries have been in a process of constant change since their inception 4,000 years ago. Although change came slowly for centuries, in recent decades academic libraries have continuously striven to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently between collection and services. Unfortunately, this resource allocation is a complex process owing to the diversity of data sources and formats that must be analyzed prior to decision-making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource-allocation decisions among their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is employed. First, a holistic structure and the toolset required to assess academic libraries holistically are proposed, to collect and organize the data from an economic point of view. A four-quadrant theoretical framework is used in which the library system and collection are analyzed from the perspectives of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system: the library's performance and the costs incurred and resources consumed by library services are analyzed. The second quadrant evaluates the external perspective of the library system: users' perception of service quality is judged in this quadrant. The third quadrant analyses the external perspective of the library collection, evaluating the impact of the current collection on its users. Finally, the fourth quadrant evaluates the internal perspective of the library collection: the usage patterns of the collection are analyzed. With a complete framework for data collection in place, these data, coming from multiple sources and therefore in different formats, need to be integrated and stored in a scheme adequate for decision support. Second, a data warehousing approach is designed and implemented to integrate, process, and store the holistically collected data. Finally, the strategic data stored in the data warehouse are analyzed for several purposes, including the following: 1) Data visualization and reporting, allowing library managers to publish library indicators simply and quickly using online reporting tools. 2) Sophisticated data analysis through data mining tools; three data mining techniques are examined in this study: regression, clustering, and classification. These techniques are applied to the case study as follows: predicting future investment in library development; finding clusters of users who share common interests and similar profiles but belong to different faculties; and identifying library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) Input for optimization models; early experience of developing an optimal resource-allocation model to distribute resources among the different processes of a library system is documented in this study. Specifically, the problem of allocating funds for the digital collection among the divisions of an academic library is addressed. An optimization model is defined with the objective of maximizing the usage of the digital collection across all library divisions subject to a single collection budget. By proposing this holistic approach, the study contributes to knowledge by providing an integrated solution to assist library managers in making economic decisions based on an "as realistic as possible" view of the library's situation.
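The single-budget allocation problem sketched above can be illustrated with a toy greedy heuristic that funds resources in decreasing order of expected usage per unit cost. The divisions, costs, and usage figures are invented, and the study's actual model is a formal optimization, not this heuristic.

```python
# Toy version of the single-budget allocation problem: fund digital resources
# across library divisions to maximize expected usage. Greedy by usage/cost;
# all figures below are invented for illustration.

resources = [
    # (division, resource, cost, expected annual usage)
    ("Engineering", "IEEE package", 40_000, 90_000),
    ("Engineering", "Springer eBooks", 25_000, 30_000),
    ("Humanities", "JSTOR upgrade", 15_000, 24_000),
    ("Sciences", "Elsevier package", 60_000, 75_000),
    ("Sciences", "ACS journals", 20_000, 36_000),
]
BUDGET = 100_000

# Fund in decreasing usage-per-dollar order until the budget runs out.
funded, spent, usage = [], 0, 0
for division, name, cost, use in sorted(resources,
                                        key=lambda r: r[3] / r[2],
                                        reverse=True):
    if spent + cost <= BUDGET:
        funded.append(name)
        spent += cost
        usage += use

print(funded, spent, usage)
```

A greedy ratio rule like this is only a heuristic for the underlying knapsack-type problem; an exact formulation of the kind described in the study would instead solve an integer or linear program over the same cost and usage data.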