Abstract:
The spray zone is an important region for controlling the nucleation of granules in a high shear granulator. In this study, a spray zone with cross flow is quantified as a well-mixed compartment in a high shear granulator. Granulation kinetics are quantitatively derived at both the particle scale and the spray zone scale. Two spatial decay rates, the droplet-granule spatial decay rate (DGSDR) ζDG and the droplet-primary particle spatial decay rate (DPSDR) ζDP, which are functions of the volume fraction and diameter of the particulate species within the powder bed, are defined to simplify the derivation. In cross flow, explicit analytical results show that the droplet concentration decays exponentially with depth, which implies a nominally infinite spray zone depth in a real penetration process. In a well-mixed spray zone, the depth of the spray zone is 4/(ζDG + ζDP) for a cuboid and π²/[3(ζDG + ζDP)] for a cylinder. The first-order droplet-based collision rates, the nucleation rate B0 and the rewetting rate RW0, are uncorrelated with the flow pattern and shape of the spray zone. The second-order droplet-based collision rate, the nucleated granule-granule collision rate RGG, is correlated with the mixing pattern. Finally, a real formulation case of a high shear granulation process is used to estimate the size of the spray zone; the results show that the spray zone is a thin layer at the powder bed surface. We present, for the first time, the spray zone as a well-mixed compartment. The granulation kinetics of a well-mixed spray zone can be integrated into a Population Balance Model (PBM), particularly to aid development of a distributed model for product quality prediction.
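To make the depth expressions concrete, here is a minimal numerical sketch (the decay rate values are hypothetical placeholders, not taken from the study):

```python
import math

def spray_zone_depth(zeta_dg: float, zeta_dp: float, shape: str = "cuboid") -> float:
    """Depth of a well-mixed spray zone from the two spatial decay rates.

    zeta_dg: droplet-granule spatial decay rate (DGSDR), in 1/m
    zeta_dp: droplet-primary particle spatial decay rate (DPSDR), in 1/m
    """
    total = zeta_dg + zeta_dp
    if shape == "cuboid":
        return 4.0 / total
    if shape == "cylinder":
        return math.pi ** 2 / (3.0 * total)
    raise ValueError("shape must be 'cuboid' or 'cylinder'")

# Hypothetical decay rates (1/m), chosen only for illustration:
print(spray_zone_depth(2.0e3, 1.5e3, "cuboid"))    # ~1.1e-3 m
print(spray_zone_depth(2.0e3, 1.5e3, "cylinder"))  # ~9.4e-4 m
```

For decay rates of this magnitude, both shapes give depths on the order of a millimetre, consistent with the conclusion that the spray zone is a thin layer at the powder bed surface.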
Abstract:
The conventional, geometrically lumped description of the physical processes inside a high shear granulator is not reliable for process design and scale-up. In this study, a compartmental Population Balance Model (PBM) with spatial dependence is developed and validated on two lab-scale high shear granulation processes using a 1.9 L MiPro granulator and a 4 L DIOSNA granulator. The compartmental structure is built using a heuristic approach based on computational fluid dynamics (CFD) analysis, which accounts for the overall flow pattern, velocity and solids concentration. The constant volume Monte Carlo approach is implemented to solve the multi-compartment population balance equations. Different spatially dependent mechanisms are included in the compartmental PBM to describe granule growth. It is concluded that for both cases (low and high liquid content), adjustment of the rate parameters (e.g. layering, coalescence and breakage rates) provides a quantitative prediction of the granulation process.
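As a rough illustration of the solution method, the sketch below implements a single-compartment constant volume Monte Carlo step for pure coalescence with a size-independent kernel; the actual kernels, compartment-exchange rules, and layering/breakage mechanisms used in the study are not reproduced here.

```python
import random

def coalescence_mc(volumes, kernel_const=1e-3, t_end=10.0, seed=0):
    """Minimal constant-volume Monte Carlo for pure coalescence.

    volumes: particle volumes in a fixed control volume.
    kernel_const: size-independent coalescence kernel (events per pair per unit time).
    """
    rng = random.Random(seed)
    t, v = 0.0, list(volumes)
    while t < t_end and len(v) > 1:
        n = len(v)
        total_rate = kernel_const * n * (n - 1) / 2.0  # sum over all pairs
        t += rng.expovariate(total_rate)               # Gillespie-style waiting time
        if t >= t_end:
            break
        i, j = rng.sample(range(n), 2)                 # pick a random pair
        v[i] += v[j]                                   # coalesce j into i
        v.pop(j)
    return v

# Hypothetical initial population: 500 primary particles of unit volume.
final = coalescence_mc([1.0] * 500)
print(len(final), sum(final))  # particle count shrinks, total volume is conserved
```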
Abstract:
There is great demand for incorporating advanced engineering tools into biology, biochemistry and medicine. Many existing instruments and tools are expensive and require special facilities. With the advent of nanotechnology in the past decade, new approaches to developing devices and tools have been generated by academia and industry. One such technology, NMR spectroscopy, has been used by biochemists for more than two decades to study the molecular structure of chemical compounds. However, NMR spectrometers are very expensive and require special laboratory rooms for proper operation. High magnetic fields, with strengths on the order of several tesla, make these instruments unaffordable to most research groups. This doctoral research proposes a new technology for developing NMR spectrometers that can operate at field strengths of less than 0.5 T, using an inexpensive permanent magnet and spin-dependent nanoscale magnetic devices. This portable NMR system is intended to analyze samples as small as a few nanoliters. The main problem to resolve when downscaling the variables is obtaining an NMR signal with a high signal-to-noise ratio (SNR). A special tunneling magnetoresistive (TMR) sensor design was developed to achieve this goal. The minimum specifications for each component of the proposed NMR system were established, and a complete NMR system was designed based on these minimum requirements. The goal was always to find cost-effective, realistic components. The novel design of the NMR system uses technologies such as Direct Digital Synthesis (DDS), Digital Signal Processing (DSP) and a special backpropagation neural network that finds the best match of the NMR spectrum. The system was designed, calculated and simulated with excellent results. In addition, a general method to design TMR sensors was developed; the technique was automated, and a computer program was written to help the designer perform this task interactively.
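One consequence of operating below 0.5 T can be checked directly: the proton resonance stays in the low-MHz band where DDS excitation and digital acquisition are inexpensive. A minimal sketch using only the standard 1H gyromagnetic ratio:

```python
GAMMA_OVER_2PI_H1 = 42.577e6  # 1H gyromagnetic ratio / 2*pi, in Hz per tesla

def larmor_frequency_hz(b0_tesla: float) -> float:
    """Larmor precession frequency of 1H at field strength b0."""
    return GAMMA_OVER_2PI_H1 * b0_tesla

# At the 0.5 T upper bound proposed for the permanent-magnet system:
print(larmor_frequency_hz(0.5) / 1e6)  # ~21.3 MHz
```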
Abstract:
Three-dimensional (3-D) imaging is vital in computer-assisted surgical planning, including minimally invasive surgery, targeted drug delivery, and tumor resection. Selective Internal Radiation Therapy (SIRT) is a liver-directed radiation therapy for the treatment of liver cancer. Accurate calculation of anatomical liver and tumor volumes is essential for determining the tumor-to-normal-liver ratio and for calculating the dose of Y-90 microspheres that will result in a high concentration of radiation in the tumor region compared to nearby healthy tissue. Present manual techniques for segmentation of the liver from Computed Tomography (CT) tend to be tedious and greatly dependent on the skill of the technician/doctor performing the task. This dissertation presents the development and implementation of a fully integrated algorithm for 3-D liver and tumor segmentation from tri-phase CT that yields highly accurate estimates of the respective volumes of the liver and tumor(s). The algorithm as designed requires minimal human intervention without compromising the accuracy of the segmentation results. Embedded within this algorithm is an effective method for extracting the blood vessels that feed the tumor(s), in order to plan the appropriate treatment effectively. Segmentation of the liver achieved an accuracy in excess of 95% in estimating liver volumes in 20 datasets in comparison to the manual gold standard volumes. In a similar comparison, tumor segmentation exhibited an accuracy of 86% in estimating tumor volume(s). Qualitative results demonstrated the effectiveness of the blood vessel segmentation algorithm in extracting and rendering the vasculature of the liver. The parallel computing process, using a single workstation, showed a 78% gain. Statistical analysis carried out to determine whether manual initialization has any impact on accuracy showed that the results are independent of user initialization. The dissertation thus provides a complete 3-D solution for liver cancer treatment planning, with the ability to extract, visualize and quantify the statistics needed for treatment. Since SIRT requires highly accurate calculation of the liver and tumor volumes, this new method provides the effective and computationally efficient process required by such challenging clinical requirements.
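The exact accuracy metric is not specified in the abstract; a minimal sketch of one plausible definition, percent volume agreement with the manual gold standard, is:

```python
def volume_accuracy(auto_volume_ml: float, manual_volume_ml: float) -> float:
    """Percent accuracy of an automated volume against a manual gold standard."""
    return 100.0 * (1.0 - abs(auto_volume_ml - manual_volume_ml) / manual_volume_ml)

# Hypothetical example: 1520 mL automated vs. 1485 mL manual liver volume.
print(volume_accuracy(1520.0, 1485.0))  # ~97.6%, in line with the >95% reported
```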
Abstract:
Transition metals (Ti, Zr, Hf, Mo, W, V, Nb, Ta, Pd, Pt, Cu, Ag, and Au) are essential building units of many materials and have important industrial applications. It is therefore important to understand their thermal and physical behavior when they are subjected to extreme conditions of pressure and temperature. This dissertation presents:
• An improved experimental technique that uses lasers to measure the thermal conductivity of materials under conditions of very high pressure (P, up to 50 GPa) and temperature (T, up to 2500 K).
• An experimental study of the phase relationships and physical properties of selected transition metals, which revealed new and unexpected thermal conductivity effects in Zr and Hf under high P-T.
• New phase diagrams for Hf, Ti and Zr created from experimental data.
• The P-T dependence of the lattice parameters in α-hafnium; contrary to prior reports, the α-ω phase transition in hafnium has a negative dT/dP slope.
• New data on the thermodynamic and physical properties of several transition metals and their respective high P-T phase diagrams.
• The first complete thermodynamic database for the solid phases of 13 common transition metals. This database contains: all the thermochemical data on these elements in their standard state (mostly available and compiled); all the equations of state (EoS) formulated from pressure-volume-temperature data (measured as part of this study and from the literature); and complete thermodynamic data for selected elements from standard to extreme conditions.
The thermodynamic database provided by this study can be used with available thermodynamic software to calculate all thermophysical properties and phase diagrams at high P-T conditions. For readers who do not have access to such software, tabulated values of all thermodynamic and volume data for the 13 metals at high P-T are included in the APPENDIX, along with a description of several other high-pressure studies of selected oxide systems. Thermophysical properties (Cp, H, S, G) of the high P-T ω-phase of Ti, Zr and Hf were determined during the optimization of the EoS parameters and are presented here for the first time. These results should have important implications for understanding hexagonal-close-packed to simple-hexagonal phase transitions in transition metals and other materials.
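The abstract does not name the EoS form used; as an illustration of the kind of relation such a database would encode, here is a sketch of the widely used third-order Birch-Murnaghan isothermal EoS with hypothetical, roughly metal-like parameters:

```python
def birch_murnaghan_3rd(v: float, v0: float, k0: float, k0_prime: float) -> float:
    """Pressure (GPa) at volume v from the third-order Birch-Murnaghan EoS.

    v0: zero-pressure volume, k0: bulk modulus (GPa), k0_prime: dK/dP.
    """
    eta = (v0 / v) ** (1.0 / 3.0)
    return 1.5 * k0 * (eta**7 - eta**5) * (1.0 + 0.75 * (k0_prime - 4.0) * (eta**2 - 1.0))

# Hypothetical parameters, for illustration only (not fitted values from this study):
print(birch_murnaghan_3rd(v=12.5, v0=13.5, k0=110.0, k0_prime=3.5))  # ~10 GPa
```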
Abstract:
Taylor Slough is one of the natural freshwater contributors to Florida Bay through a network of microtidal creeks crossing the Everglades Mangrove Ecotone Region (EMER). The ecological function of the EMER is critical, since it mediates freshwater and nutrient inputs and controls the water quality in eastern Florida Bay. Furthermore, this region is vulnerable to changing hydrodynamics and nutrient loadings as a result of upstream freshwater management practices proposed by the Comprehensive Everglades Restoration Plan (CERP), currently the largest wetland restoration project in the USA. Despite the hydrological importance of Taylor Slough in the water budget of Florida Bay, there are no fine-scale (~1 km²) hydrodynamic models of this system that can be used as a tool to evaluate potential changes in water flow, salinity, and water quality. Taylor River is one of the major creeks draining Taylor Slough freshwater into Florida Bay. We performed a water budget analysis for the Taylor River area, based on long-term hydrologic data (1999-2007) and supplemented by hydrodynamic modeling with a MIKE FLOOD (DHI, http://dhigroup.com/) model, to evaluate groundwater and overland water discharges. The seasonal hydrologic characteristics are very distinctive (the ratio of average wet to dry season Taylor River outflow was 6:1 during 1999-2006), with pronounced interannual variability of flow. The water budget shows a net dominance of through-flow in the tidal mixing zone, while local precipitation and evapotranspiration play only a secondary role, at least in the wet season. During the dry season, the tidal flood reaches the upstream boundary of the study area on approximately 80 days per year on average. The groundwater field measurements indicate a mostly upward-oriented leakage, which possibly equals the evapotranspiration term. The model results suggest that groundwater contributes substantially to the water salinity in the EMER. The model performance is satisfactory during the dry season, when surface flow in the area is confined to the Taylor River channel. The model also provided guidance on the importance of capturing the overland flow component, which enters the area as sheet flow during the rainy season. Overall, the modeling approach is suitable for reaching a better understanding of the water budget in the mangrove region. However, more detailed field data are needed to validate model predictions by further calibrating the overland flow parameters.
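A minimal sketch of the box water budget underlying such an analysis (all term values hypothetical, for illustration only):

```python
def storage_change(q_in, q_out, precip, et, leakage):
    """Change in storage for a simple control volume (all terms in m^3/day):
    dS = Qin - Qout + P - ET + G, with G the net upward groundwater leakage."""
    return q_in - q_out + precip - et + leakage

# Hypothetical daily wet-season terms (m^3/day):
print(storage_change(q_in=9.0e4, q_out=8.5e4, precip=1.2e4, et=1.0e4, leakage=0.8e4))
```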
Abstract:
This article documents all major articles in the FIU Hospitality Review, from its inaugural issue in the spring of 1983 through 2001. It covers 346 articles and 325 authors from 127 affiliations, and identifies the academic institutions, hospitality industry organizations and authors that have contributed most frequently. The high ranking received by the FIU Hospitality Review reflects the many researchers and industry executives who have contributed over the past two decades.
Abstract:
Mangrove forests are ecosystems susceptible to changing water levels and temperatures due to climate change, as well as to perturbations resulting from tropical storms. Numerical models can be used to project mangrove forest responses to regional and global environmental changes, and the reliability of these models depends on surface energy balance closure. However, for tidal ecosystems the surface energy balance is complex, because the energy transport associated with tidal activity remains poorly understood. This study aimed to quantify the impacts of tidal flows on energy dynamics within a mangrove ecosystem. To address the research objective, an intensive 10-day study was conducted in a mangrove forest located along the Shark River in the Everglades National Park, FL, USA. Forest-atmosphere turbulent exchanges of energy were quantified with an eddy covariance system installed on a 30-m-tall flux tower. Energy transport associated with tidal activity was calculated based on a coupled mass and energy balance approach. The mass balance included tidal flows and accumulation of water on the forest floor. The energy balance included temporal changes in enthalpy resulting from tidal flows and temperature changes in the water column. By serving as a net sink or source of available energy, flood waters reduced the impact of high radiational loads on the mangrove forest. Also, the regression slope between available energy and the energy sink terms increased from 0.730 to 0.754 for 30-min periods, and from 0.798 to 0.857 for daily daytime sums, when the total enthalpy change in the water column was included in the surface energy balance. Results indicated that tidal inundation provides an important mechanism for heat removal and that tidal exchange should be considered in surface energy budgets of coastal ecosystems. Results also demonstrated the importance of including tidal energy advection in mangrove biophysical models that are used to predict ecosystem responses to changing climate and regional freshwater management practices.
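The quoted slopes come from regressing measured energy sink terms against available energy; a minimal sketch of such a closure-slope fit on synthetic 30-min fluxes (a zero-intercept fit is assumed here for simplicity; values are hypothetical):

```python
import numpy as np

def closure_slope(available_energy, sink_terms):
    """Zero-intercept least-squares slope of sink terms vs. available energy;
    a slope of 1.0 indicates perfect surface energy balance closure."""
    a = np.asarray(available_energy, dtype=float)
    s = np.asarray(sink_terms, dtype=float)
    return float(np.sum(a * s) / np.sum(a * a))

# Synthetic 30-min fluxes (W/m^2), illustration only:
a = np.array([450.0, 380.0, 520.0, 300.0])
s = 0.75 * a + np.array([5.0, -8.0, 12.0, -3.0])
print(closure_slope(a, s))  # ~0.76, the same order as the slopes quoted above
```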
Abstract:
Mapping of vegetation patterns over large extents using remote sensing methods requires field sample collections for two different purposes: (1) the establishment of plant association classification systems from samples of relative abundance estimates; and (2) training for supervised image classification and accuracy assessment of maps derived from satellite data. One challenge for both procedures is establishing confidence in the results and enabling analysis across multiple spatial scales. Continuous data sets that enable cross-scale studies are very time consuming and expensive to acquire, and such extensive field sampling can be invasive. The use of high resolution aerial photography (hrAP) offers an alternative to extensive, invasive field sampling and can provide large-volume, spatially continuous reference information that meets the challenges of confidence building and multi-scale analysis.
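As a sketch of the accuracy-assessment step that hrAP-derived reference samples would feed, overall map accuracy can be computed from a confusion matrix (the counts below are hypothetical):

```python
import numpy as np

def overall_accuracy(confusion) -> float:
    """Overall accuracy from a confusion matrix (rows: mapped class, cols: reference)."""
    m = np.asarray(confusion, dtype=float)
    return float(np.trace(m) / m.sum())

# Hypothetical 3-class confusion matrix built from hrAP reference samples:
cm = [[50, 4, 1],
      [6, 42, 5],
      [2, 3, 47]]
print(overall_accuracy(cm))  # ~0.87
```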
Abstract:
The Last Interglacial (LIG, 129-116 thousand years before present, ka) represents a test bed for climate model feedbacks in warmer-than-present high-latitude regions. However, a spatio-temporal picture of LIG temperature changes is difficult to obtain, mainly because aligning different palaeoclimatic archives from different parts of the world is not trivial. Here, we have selected 47 polar ice core and sub-polar marine sediment records and developed a strategy to align them onto the recent AICC2012 ice core chronology. We provide the first compilation of high-latitude temperature changes across the LIG associated with a coherent temporal framework built between ice core and marine sediment records. Our new data synthesis highlights non-synchronous maximum temperature changes between the two hemispheres, with Southern Ocean and Antarctic records showing an early warming compared to North Atlantic records. We also observe that warmer-than-present-day conditions persisted for a longer period in southern high latitudes than in northern high latitudes. Finally, the amplitude of temperature change recorded at the onset and the demise of the LIG is larger at high northern latitudes than at high southern latitudes. We have also compiled four data-based time slices with temperature anomalies (relative to present-day conditions) at 115 ka, 120 ka, 125 ka and 130 ka, and quantitatively estimated temperature uncertainties that include relative dating errors. This provides an improved benchmark for performing more robust model-data comparisons. The surface temperatures simulated by two General Circulation Models (CCSM3 and HadCM3) for 130 ka and 125 ka are compared to the corresponding time-slice data syntheses. This comparison shows that the models predict warmer-than-present conditions earlier than documented in the North Atlantic, while neither model is able to reproduce the reconstructed early Southern Ocean and Antarctic warming. Our results highlight the importance of producing a sequence of time slices rather than a single time slice averaging the LIG climate conditions.
Abstract:
The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4,000,000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance, with considerably more data archived on calcification and primary production than on other processes, has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.
Abstract:
This collection of 359 data sets comprises raw data from physical property measurements on Polarstern sediment cores from both polar oceans, sampled and measured between 1985 and 1995.
Abstract:
Radiocarbon stratigraphy is an essential tool for high resolution paleoceanographic studies. Age models based on radiocarbon ages of foraminifera are commonly applied to a wide range of geochemical studies, including the investigation of temporal leads and lags. The critical assumption is that temporal coupling between foraminifera and other sediment constituents, including specific molecular organic compounds (biomarkers) of marine phytoplankton, e.g. alkenones, is maintained in the sediments. To test this critical assumption in the Benguela upwelling area, we determined radiocarbon ages of total C37-C39 alkenones in 20 samples from two gravity cores and three multicorer cores. The cores were retrieved from the continental shelf and slope off Namibia, and samples were taken from Holocene, deglacial and Last Glacial Maximum core sections. The alkenone radiocarbon ages were compared to those of planktic foraminifera, total organic carbon, fatty acids and fine-grained carbonates from the same samples. Interestingly, the alkenones were 1000 to 4500 yr older than the foraminifera in all samples. Such age differences may result from several processes: bioturbation combined with grain-size effects, lateral advection of (recycled) material, and redeposition of sediment on upper continental slopes due to currents or tidal movement. Based on the results of this study, the age offsets between foraminifera and alkenones in sediments from the upper continental slope off Namibia most probably do not result from particle-selective bioturbation. Resuspension of organic particles by tidal movement of bottom waters, with velocities of up to 25 cm/s recorded near the core sites, is the more likely explanation. Our results imply that age control established using radiocarbon measurements of foraminifera may be inadequate for the interpretation of alkenone-based proxy data. Observed temporal leads and lags between foraminifera-based data and data derived from alkenone measurements may therefore be secondary signals, i.e. the result of processes associated with particle settling and biological activity.
Abstract:
Measurement and verification of products and processes during early design is attracting increasing interest from high value manufacturing industries. Measurement planning is deemed an effective means of facilitating the integration of metrology activity into a wider range of production processes. However, the literature reveals very few research efforts in this field, especially regarding large volume metrology. This paper presents a novel approach to instrument selection, the first stage of the measurement planning process, by mapping measurability characteristics between specific measurement assignments and instruments.
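The paper's measurability-mapping scheme is not detailed in this abstract; the sketch below only illustrates the general idea of filtering candidate large volume instruments against an assignment's requirements (all instrument names and figures are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Instrument:
    name: str
    max_range_m: float     # largest measurable dimension
    uncertainty_um: float  # measurement uncertainty over that range

def select_instruments(candidates, required_range_m, max_uncertainty_um):
    """Keep instruments whose range covers the assignment and whose
    uncertainty meets the tolerance-derived requirement."""
    return [c for c in candidates
            if c.max_range_m >= required_range_m
            and c.uncertainty_um <= max_uncertainty_um]

catalogue = [
    Instrument("laser tracker", 40.0, 25.0),
    Instrument("photogrammetry rig", 10.0, 100.0),
    Instrument("indoor GPS", 50.0, 200.0),
]
print([c.name for c in select_instruments(catalogue, 15.0, 50.0)])  # ['laser tracker']
```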
Abstract:
Hydrogen has been called the fuel of the future, and as its non-renewable counterparts become scarce, the economic viability of hydrogen gains traction. The potential of hydrogen is marked by its high mass-specific energy density and wide applicability as a fuel in fuel cell vehicles and homes. However, hydrogen's volume must be reduced via pressurization or liquefaction in order to make it more transportable and volume efficient. Currently, the vast majority of industrially produced hydrogen comes from steam reforming of natural gas. This practice yields low-pressure gas, which must then be compressed at considerable cost, and uses fossil fuels as a feedstock, leaving behind harmful CO and CO2 gases as by-products. The second method used by industry to produce hydrogen gas is low-pressure electrolysis. In comparison, the electrolysis of water at low pressure can produce pure hydrogen and oxygen gas with no harmful by-products, using only water as a feedstock, but the hydrogen must still be compressed before use. Multiple theoretical works agree that high pressure electrolysis could reduce the energy losses due to product gas compression. However, these works openly admit that their projected gains are purely theoretical and ignore the practical limitations and resistances of a real high pressure system. The goal of this work is to experimentally confirm the proposed thermodynamic gains of ultra-high pressure electrolysis in alkaline solution and to characterize the behavior of a real high pressure system.
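The thermodynamic trade-off being tested can be illustrated with the Nernst correction to the reversible cell voltage, assuming ideal gases and unit water activity (a textbook sketch, not the apparatus model used in this work):

```python
import math

R = 8.314    # gas constant, J/(mol K)
F = 96485.0  # Faraday constant, C/mol
E0 = 1.229   # reversible voltage for water electrolysis at 25 C and 1 bar, V

def reversible_voltage(p_h2_bar: float, p_o2_bar: float, t_kelvin: float = 298.15) -> float:
    """Nernst-corrected reversible voltage for water electrolysis."""
    return E0 + (R * t_kelvin / (2 * F)) * math.log(p_h2_bar * math.sqrt(p_o2_bar))

print(reversible_voltage(1.0, 1.0))      # 1.229 V at ambient pressure
print(reversible_voltage(700.0, 700.0))  # ~1.36 V with both gases at 700 bar
```

The extra ~0.13 V of electrical work at 700 bar performs the compression electrochemically, close to the reversible limit, which is the source of the projected gains over separate mechanical compression that this work sets out to verify.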