987 results for "Maximum and minimum air temperature"
Abstract:
The aim of this study was to test the hypothesis that differences in the pattern of seasonal growth in foliose lichens from year to year are determined by yearly differences in the distribution of rainfall, shortwave radiation and temperature. Hence, the radial growth of Parmelia conspersa (Ehrh. ex Ach.) Ach., P. glabratula ssp. fuliginosa (Fr. ex Duby) Laund. and Physcia orbicularis (Neck.) Poetsch. was studied on slate fragments over 34 successive months in an area of South Gwynedd, Wales, U.K. Similarities and differences were observed in the pattern of seasonal growth of the three species. Periods of maximum growth of a species occurred in different seasons in successive years. Correlation and multiple regression analysis suggested that total rainfall per month was the most important climatic variable positively correlated with monthly growth. Significant positive correlations were found in some growth periods with the number of raindays per month, average wind speed, and maximum and minimum temperature. The total number of sunshine hours per month and the frequency of ground frosts were negatively correlated with monthly growth in some growth periods. For each species, monthly radial growth was correlated with different climatic variables in each growth period. Hence, the results support the hypothesis, in that periods of maximum growth can occur in any season in South Gwynedd and depend on (1) the distribution of periods of high total rainfall and (2) whether or not these periods coincide with periods of maximum sunlight.
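The correlation and multiple-regression approach described above can be sketched as follows. The monthly climate and growth values below are invented for illustration (they are not the study's data), and only three of the climatic variables are included:

```python
import numpy as np

# Hypothetical monthly records (illustrative values, NOT the study's data).
# Columns: total rainfall (mm), raindays, sunshine hours.
climate = np.array([
    [120.0, 18, 60.0],
    [40.0, 8, 150.0],
    [95.0, 15, 90.0],
    [150.0, 20, 45.0],
    [30.0, 6, 170.0],
    [110.0, 17, 70.0],
])
growth = np.array([0.32, 0.10, 0.25, 0.40, 0.07, 0.30])  # radial growth (mm/month)

# Multiple regression: growth ~ intercept + climate variables (ordinary least squares).
X = np.column_stack([np.ones(len(growth)), climate])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)

# Simple correlation of growth with rainfall alone.
r = np.corrcoef(climate[:, 0], growth)[0, 1]
print("coefficients:", beta)
print("rainfall-growth correlation:", round(r, 3))
```

In the study, such coefficients and correlations were computed separately for each species and growth period.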
Abstract:
An apparatus was designed and constructed that enabled material to be melted and heated to a maximum temperature of 1000 °C and then flooded with a pre-heated liquid. A series of experiments was conducted to investigate the thermal interaction between molten metals (aluminium, lead and tin) and sub-cooled water. The cooling rates of the molten materials under flooding conditions were measured with a high-speed thermocouple and recorded with a transient recorder. A simplified model for calculating heat fluxes and metal surface temperatures was developed and used. Experimental results yielded boiling heat transfer in the transition film and stable film regions of the classic boiling curve. Maximum and minimum heat fluxes were observed at the nucleate boiling crisis and the Leidenfrost point, respectively. Results indicate that heat transfer from molten metals to sub-cooled water is a function of temperature and coolant depth and not a direct function of the physical properties of the metals. Heat transfer in the unstable transition film boiling region suggests that boiling dynamics in this region, where a stationary molten metal is under pool boiling conditions at atmospheric pressure, would not initiate a fuel-coolant interaction. Low heat fluxes around the Leidenfrost point would provide efficient fuel-coolant decoupling by a stable vapour blanket, enabling coarse mixing of the fuel and coolant to occur without appreciable loss of thermal energy from the fuel. The research was conducted by Gareph Boxley and was submitted for the degree of PhD at the University of Aston in Birmingham in 1980.
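A simplified heat-flux model of the kind mentioned above can be sketched as a lumped-capacitance estimate, in which the surface heat flux is inferred from the measured cooling rate of the melt. This is only an illustrative assumption about the model's form, and all numbers are invented:

```python
# Lumped-capacitance sketch: surface heat flux inferred from the measured
# cooling rate of the melt. Numbers are illustrative assumptions.

def surface_heat_flux(mass_kg, specific_heat, area_m2, cooling_rate):
    """Heat flux (W/m^2) from a cooling rate dT/dt (K/s, negative while cooling)."""
    return -mass_kg * specific_heat * cooling_rate / area_m2

# Example: 0.5 kg of molten tin (c_p ~ 240 J/(kg K) assumed) cooling at
# 50 K/s over a 0.01 m^2 melt surface.
q = surface_heat_flux(0.5, 240.0, 0.01, -50.0)
print(q)  # roughly 6e5 W/m^2
```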
Abstract:
The aim of this work is the implementation of a low-temperature reforming (LT reforming) unit downstream of the Haloclean pyrolyser in order to enhance the heating value of the pyrolysis gas. Achieving synthesis-gas quality for further use was outside the focus of this work. Temperatures between 400 °C and 500 °C were applied. A commercial nickel-based pre-reforming catalyst from Südchemie was chosen for LT reforming. Wheat straw was used as the biogenic feedstock. Pyrolysis of wheat straw at 450 °C by means of Haloclean pyrolysis yields 28% char, 50% condensate and 22% gas. The condensate separates into a water phase and an organic phase. The organic phase is liquid but contains viscous compounds. These compounds may undergo ageing and lead to solid tars, which can cause post-processing problems. Therefore, the implementation of a catalytic reformer is not only of interest from an energetic point of view; it is generally interesting for tar conversion after pyrolysis applications. By using a fixed-bed reforming unit at 450–490 °C and space velocities of about 3000 l/h, the pyrolysis gas volume flow could be increased by about 58%. This corresponds to a decrease in condensate yield through catalysis of up to 17%; the char yield remains unchanged, since the pyrolysis conditions are the same. The heating value of the pyrolysis gas could be increased by a factor of 1.64. Hydrogen concentrations of up to 14% could be realised.
Abstract:
Objectives - Powdered and granulated particulate materials make up most of the ingredients of pharmaceuticals and are often at risk of undergoing unwanted agglomeration, or caking, during transport or storage. This risk is particularly acute when bulk powders are exposed to extreme swings in temperature and relative humidity, which is now common as drugs are produced and administered in increasingly hostile climates and are stored for longer periods prior to use. This study explores the possibility of using a uniaxial unconfined compression test to compare the strength of caked agglomerates exposed to different temperatures and relative humidities. It is part of a longer-term study to construct a protocol for predicting the caking tendency of a new bulk material from individual particle properties. The main challenge is to develop techniques that provide repeatable results yet are simple enough to be useful to a wide range of industries. Methods - Powdered sucrose, a major pharmaceutical ingredient, was poured into a split die and exposed to high and low relative humidity cycles at room temperature. The typical ranges were 20–30% for the lower value and 70–80% for the higher value. The outer die casing was then removed and the resultant agglomerate was subjected to an unconfined compression test using a plunger fitted to a Zwick compression tester. Force against displacement was logged so that the dynamics of failure as well as the failure load of the sample could be recorded. The experimental matrix included varying the number of cycles, the difference between the maximum and minimum relative humidity, the heights and diameters of the samples, and the particle size. Results - Trends showed that the tensile strength of the agglomerates increased with the number of cycles and also with more extreme swings in relative humidity.
This agrees with previous work on alternative methods of measuring the tensile strength of sugar agglomerates formed by humidity cycling (Leaper et al., 2003). Conclusions - The results show that at the very least the uniaxial tester is a good comparative tester for examining the caking tendency of powdered materials, with an arrangement and operation simple enough to be compatible with the requirements of industry. However, further work is required to optimize the height/diameter ratio used in tests.
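The conversion from the logged failure load to an agglomerate strength can be sketched as a simple force-over-area calculation. This nominal stress is an assumption for illustration, since the abstract does not state how tensile strength was derived from the compression test:

```python
import math

# Nominal failure stress (force over cross-sectional area) for a cylindrical
# agglomerate; the 50 N load and 25 mm diameter are invented examples.

def nominal_strength(failure_load_n, diameter_m):
    area = math.pi * (diameter_m / 2.0) ** 2   # cross-sectional area (m^2)
    return failure_load_n / area               # stress (Pa)

sigma = nominal_strength(50.0, 0.025)
print(round(sigma))  # roughly 1e5 Pa
```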
Abstract:
The inverse problem of determining a spacewise-dependent heat source, together with the initial temperature, for the parabolic heat equation is studied using the usual conditions of the direct problem and information from two supplementary temperature measurements at different instants of time. These spacewise-dependent temperature measurements ensure that the inverse problem has a unique solution, but the solution is unstable; hence the problem is ill-posed. We propose an iterative algorithm for the stable reconstruction of both the initial data and the source, based on a sequence of well-posed direct problems for the parabolic heat equation, which are solved at each iteration step using the boundary element method. The instability is overcome by stopping the iterations at the first iteration for which the discrepancy principle is satisfied. Numerical results are presented for a typical benchmark test example in which the input measured data are perturbed by increasing amounts of random noise. The numerical results show that the proposed procedure gives accurate numerical approximations in relatively few iterations.
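The stopping strategy described above can be illustrated with a schematic Landweber-type iteration on a toy diagonal operator whose decaying singular values mimic the ill-posedness. The paper itself solves parabolic direct problems with the boundary element method; everything below is an invented stand-in:

```python
import numpy as np

# Toy ill-posed system: a diagonal operator with decaying singular values.
s = np.array([1.0, 0.5, 0.1, 0.05, 0.01])
x_true = np.ones(5)
v = np.array([0.3, -0.5, 0.4, -0.2, 0.6])
delta = 0.02                                    # assumed noise level
y = s * x_true + delta * v / np.linalg.norm(v)  # noisy data, ||noise|| = delta

# Landweber iteration x_{k+1} = x_k - omega * A^T (A x_k - y), stopped at the
# first iterate whose residual satisfies the discrepancy principle.
x = np.zeros(5)
omega = 1.0 / s.max() ** 2                      # step size ensuring convergence
tau = 1.1                                       # discrepancy constant, tau > 1
for k in range(1, 5001):
    x = x - omega * s * (s * x - y)
    if np.linalg.norm(s * x - y) <= tau * delta:
        break                                   # discrepancy principle satisfied
print("stopped at iteration", k)
print("reconstruction:", np.round(x, 2))
```

Stopping early keeps the poorly resolved components (those with the smallest singular values) from amplifying the noise, which is the role the discrepancy principle plays in the paper's algorithm.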
Abstract:
PURPOSE. To examine the relation between ocular surface temperature (OST) assessed by dynamic thermal imaging and physical parameters of the anterior eye in normal subjects. METHODS. Dynamic ocular thermography (ThermoTracer 7102MX) was used to record body temperature and continuous ocular surface temperature for 8 s after a blink in the right eyes of 25 subjects. Corneal thickness, corneal curvature, and anterior chamber depth (ACD) were assessed using Orbscan II; noninvasive tear break-up time (NIBUT) was assessed using the tearscope; slit lamp photography was used to record tear meniscus height (TMH) and objective bulbar redness. RESULTS. Initial OST after a blink was significantly correlated only with body temperature (r = 0.80, p < 0.0005), NIBUT (r = -0.68, p < 0.005) and corneal curvature (r = -0.40, p = 0.05). A regression model containing all the variables accounted for 70% (p = 0.002) of the variance in OST, of which NIBUT (29%, p = 0.004), and body temperature (18%, p = 0.005) contributed significantly. CONCLUSIONS. The results support previous theoretical models that OST radiation is principally related to the tear film; and demonstrate that it is less related to other characteristics such as corneal thickness, corneal curvature, and anterior chamber depth. © 2007 American Academy of Optometry.
Abstract:
The internal quantum efficiency (IQE) of a two-colour monolithic white light-emitting diode (LED) was measured by temperature-dependent electro-luminescence (TDEL) and analysed with a modified rate equation based on the ABC model. The external, internal and injection efficiencies of the blue and green quantum wells were analysed separately. The monolithic white LED contained one green InGaN QW and two blue QWs separated by a GaN barrier. This paper also reports the tunable behaviour of the correlated colour temperature (CCT) in pulsed operation mode and the effect of self-heating on device performance. © 2014 SPIE.
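The ABC model invoked above relates internal quantum efficiency to carrier density n through the Shockley-Read-Hall (A), radiative (B) and Auger (C) recombination coefficients: IQE = Bn² / (An + Bn² + Cn³). A minimal sketch, with textbook order-of-magnitude coefficients for InGaN quantum wells rather than values fitted in the paper:

```python
# ABC model: IQE = B n^2 / (A n + B n^2 + C n^3), with carrier density n in
# cm^-3. The coefficients below are generic order-of-magnitude assumptions
# for InGaN quantum wells, not values fitted in the paper.

A = 1e7     # Shockley-Read-Hall coefficient (1/s)
B = 1e-11   # radiative coefficient (cm^3/s)
C = 1e-30   # Auger coefficient (cm^6/s)

def iqe_abc(n):
    """Internal quantum efficiency at carrier density n (cm^-3)."""
    return B * n**2 / (A * n + B * n**2 + C * n**3)

# IQE rises with injection, peaks, then droops as Auger recombination wins.
for n in (1e15, 1e18, 1e20):
    print(f"n = {n:.0e} cm^-3 -> IQE = {iqe_abc(n):.3f}")
```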
Abstract:
The government has reinvested in Air New Zealand only a fraction of the present value of what it received for the airline in 1989, argues ALAN LOWE
Humanitarian charter and minimum standards in humanitarian response. The Sphere Project [Annotation]
Abstract:
The cactus pear has become over the years an important forage alternative for the Brazilian semiarid region, especially during long periods of drought. Despite its importance for agriculture, its cultivation has dispensed with basic crop practices and lacks a fundamental technical-scientific basis regarding its climatic requirements. Thus, the main objective of this study was to elaborate the agroclimatic zoning of cactus pear (Opuntia sp.) for the state of Paraíba. The agroclimatic zoning of cactus pear was based on climatic indicators outlined in the literature and on climatological data of precipitation and temperature (mean, maximum, and minimum) from 97 locations in the state of Paraíba. According to the results, the region of ‘Borborema’ is the most favorable for the cultivation of cactus pear. The regions of ‘Agreste’, ‘Sertão’, and the coastal part of the ‘Litoral’ may be used, but with restrictions. Overall, the cultivation of cactus pear is recommended throughout the state of Paraíba, except the coastal part of the ‘Litoral’ and the region around Areia; in both cases, the unsuitability is due to excessive precipitation.
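A zoning exercise of this kind amounts to applying climatic suitability rules to each location's records. The sketch below shows the general shape of such a rule; the thresholds are invented placeholders, not the indicators the study took from the literature:

```python
# Invented thresholds for illustration only; the study's actual indicators
# come from the cited literature.

def cactus_pear_suitability(mean_temp_c, annual_rain_mm):
    """Classify a location as suitable, restricted, or unsuitable."""
    if annual_rain_mm > 1200:                 # excessive rainfall
        return "unsuitable"
    if 400 <= annual_rain_mm <= 800 and 18 <= mean_temp_c <= 26:
        return "suitable"
    return "restricted"

print(cactus_pear_suitability(22, 600))    # a semiarid-like location
print(cactus_pear_suitability(24, 1400))   # a wet coastal-like location
```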
Abstract:
This data set contains LPJ-LMfire dynamic global vegetation model output covering Europe and the Mediterranean for the Last Glacial Maximum (LGM; 21 ka) and for a preindustrial control simulation (20th-century detrended climate). The netCDF data files are time averages of the final 30 years of the model simulation. Each netCDF file contains four or five variables: fractional cover of 9 plant functional types (PFTs; cover), total fractional coverage of trees (treecover), population density of hunter-gatherers (foragerPD; only for the "people" simulations), fraction of the gridcell burned on 30-year average (burnedf), and vegetation net primary productivity (NPP). The model spatial resolution is 0.5 degrees. For the LGM simulations, LPJ-LMfire was driven by the PMIP3 suite of eight GCMs for which LGM climate simulations were available. Also provided in this archive is the result of an LPJ-LMfire run forced by the average climate of all GCMs (the "GCM-mean" files), and the average of the individual LPJ-LMfire runs over the eight LGM scenarios (the "LPJ-mean" files). Model simulations are provided that include the influence of human presence on the landscape (the "people" files) and in a "world without humans" scenario (the "natural" files). Finally, this archive contains the preindustrial reference simulation with and without human influence ("PI_reference_people" and "PI_reference_nat", respectively). There are therefore 22 netCDF files in this archive: 8 each of LGM simulations with and without people (16 total), the "GCM-mean" simulation (2 files), the "LPJ-mean" aggregate (2 files), and the two preindustrial control simulations ("PI"), with and without humans (2 files).
In addition to the LPJ-LMfire model output (netCDF files), this archive contains a table of arboreal pollen percentages calculated from pollen samples dated to the LGM (lgmAP.txt) and a table with the locations of archaeological sites dated to the LGM (LGM_archaeological_site_locations.txt).
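The file inventory described above can be checked programmatically. The sketch below reconstructs the expected 22-file listing; the GCM identifiers and the file-name pattern are placeholders, not the archive's actual names:

```python
# Placeholder GCM identifiers and file-name pattern (the archive's real
# names differ); only the counts are taken from the description above.
gcms = [f"GCM{i}" for i in range(1, 9)]
scenarios = ["people", "natural"]

files = [f"LGM_{g}_{s}.nc" for g in gcms for s in scenarios]   # 16 LGM runs
files += [f"LGM_GCM-mean_{s}.nc" for s in scenarios]           # GCM-mean pair
files += [f"LGM_LPJ-mean_{s}.nc" for s in scenarios]           # LPJ-mean pair
files += ["PI_reference_people.nc", "PI_reference_nat.nc"]     # PI controls
print(len(files))  # 22
```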
Abstract:
Planktic foraminiferal faunas and modern analogue technique estimates of sea surface temperature (SST) for the last 1 million years (Myr) are compared between core sites to the north (ODP 1125, 178 faunas) and south (DSDP 594, 374 faunas) of the present location of the Subtropical Front (STF), east of New Zealand. Faunas beneath cool subtropical water (STW) north of the STF are dominated by dextral Neogloboquadrina pachyderma, Globorotalia inflata, and Globigerina bulloides, whereas faunas to the south are strongly dominated by sinistral N. pachyderma (80-95% in glacials), with increased G. bulloides (20-50%) and dextral N. pachyderma (15-50%) in interglacials (beneath Subantarctic Water, or SAW). Canonical correspondence analysis indicates that at both sites, SST and related factors were the most important environmental influences on faunal composition. Greater climate-related faunal fluctuations occur in the south. Significant faunal changes occur through time at both sites, particularly towards the end of the mid-Pleistocene climate transition, MIS18-15 (e.g., decline of Globorotalia crassula in STW, disappearance of Globorotalia puncticulata in SAW), and during MIS8-5. Interglacial SST estimates in the north are similar to the present day throughout the last 1 Myr. To the south, interglacial SSTs are more variable with peaks 4-7 °C cooler than present through much of the early and middle Pleistocene, but in MIS11, MIS5.5, and early MIS1, peaks are estimated to have been 2-4 °C warmer than present. These high temperatures are attributed to southward spread of the STF across the submarine Chatham Rise, along which the STF appears to have been dynamically positioned throughout most of the last 1 Myr. For much of the last 1 Myr, glacial SST estimates in the north were only 1-2 °C cooler than the present interglacial, except in MIS16, MIS8, MIS6, and MIS4-2 when estimates are 4-7 °C cooler. 
These cooler temperatures are attributed to jetting of SAW through the Mernoo Saddle (across the Chatham Rise) and/or waning of the STW current. To the south, glacial SST estimates were consistently 10-11 °C cooler than present, similar to temperatures and faunas currently found in the vicinity of the Polar Front. One interpretation is that these cold temperatures reflect thermocline changes and increased Circumpolar Surface Water spinning off the Subantarctic Front as an enhanced Bounty Gyre along the south side of the Chatham Rise. For most of the last 1 Myr, the temperature gradient across the STF has been considerably greater than the present 4 °C. During glacial episodes, the STF in this region did not migrate northwards, but instead there was an intensification of the temperature gradient across it (interglacials 4-11 °C; glacials 8-14 °C).
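The modern analogue technique used for these SST estimates matches each fossil assemblage against modern core-top assemblages, commonly by squared chord distance, and averages the SSTs of the closest analogues. A minimal sketch with invented assemblages and SSTs:

```python
import numpy as np

# Minimal modern analogue technique (MAT) sketch. All assemblages and SSTs
# below are invented for illustration.

def squared_chord(a, b):
    """Squared chord distance between two proportion vectors."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

# Modern calibration set: species proportions (rows sum to 1) and site SSTs.
modern = np.array([
    [0.80, 0.10, 0.10],   # sinistral N. pachyderma-dominated (cold)
    [0.40, 0.30, 0.30],
    [0.10, 0.50, 0.40],   # G. inflata / G. bulloides-rich (warm)
])
modern_sst = np.array([5.0, 10.0, 15.0])

fossil = np.array([0.70, 0.15, 0.15])        # a glacial-style assemblage

d = np.array([squared_chord(fossil, m) for m in modern])
k = 2
nearest = np.argsort(d)[:k]                  # indices of the k best analogues
sst_estimate = modern_sst[nearest].mean()
print(sst_estimate)
```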
Abstract:
Purpose: Computed Tomography (CT) is one of the standard diagnostic imaging modalities for the evaluation of a patient's medical condition. In comparison to other imaging modalities such as Magnetic Resonance Imaging (MRI), CT is a fast acquisition imaging device with higher spatial resolution and higher contrast-to-noise ratio (CNR) for bony structures. CT images are presented through a gray scale of independent values in Hounsfield units (HU); higher HU values represent higher density. High-density materials, such as metal, tend to erroneously increase the HU values around them due to reconstruction software limitations. This problem of increased HU values due to the presence of metal is referred to as metal artefacts. Hip prostheses, dental fillings, aneurysm clips, and spinal clips are a few examples of metal objects that are of clinical relevance. These implants create artefacts such as beam hardening and photon starvation that distort CT images and degrade image quality. This is of great significance because the distortions may cause improper evaluation of images and inaccurate dose calculation in the treatment planning system. Different algorithms are being developed to reduce these artefacts for better image quality for both diagnostic and therapeutic purposes. However, very limited information is available about the effect of artefact correction on dose calculation accuracy. This research study evaluates the dosimetric effect of metal artefact reduction algorithms on CT images with severe artefacts. It uses the Gemstone Spectral Imaging (GSI)-based MAR algorithm, the projection-based Metal Artefact Reduction (MAR) algorithm, and the Dual-Energy method.
Materials and Methods: The Gemstone Spectral Imaging (GSI)-based and SMART Metal Artefact Reduction (MAR) algorithms are metal artefact reduction protocols embedded in two different CT scanner models by General Electric (GE), and the Dual-Energy imaging method was developed at Duke University. All three approaches were applied in this research for dosimetric evaluation on CT images with severe metal artefacts. The first part of the research used a water phantom with four iodine syringes. Two sets of plans, multi-arc and single-arc, using the Volumetric Modulated Arc Therapy (VMAT) technique were designed to avoid or minimize influences from high-density objects. The second part of the research used the projection-based MAR algorithm and the Dual-Energy method. Calculated doses (mean, minimum, and maximum) to the planning treatment volume (PTV) were compared and the homogeneity index (HI) was calculated.
Results: (1) Without the GSI-based MAR application, a percent error between the mean dose and the absolute dose ranging from 3.4% to 5.7% per fraction was observed. In contrast, the error decreased to a range of 0.09% to 2.3% per fraction with the GSI-based MAR algorithm. There was a percent difference ranging from 1.7% to 4.2% per fraction between plans with and without the GSI-based MAR algorithm. (2) A difference of 0.1% to 3.2% was observed for the maximum dose values, 1.5% to 10.4% for the minimum dose, and 1.4% to 1.7% for the mean dose. Homogeneity indices (HI) were also calculated, ranging from 0.065 to 0.068 for the Dual-Energy method and from 0.063 to 0.141 with the projection-based MAR algorithm.
Conclusion: (1) The percent error without the GSI-based MAR algorithm may be as high as 5.7%. Such an error undermines the goal of radiation therapy to provide precise treatment, so the GSI-based MAR algorithm is desirable for its better dose calculation accuracy. (2) Based on direct numerical observation, there was no apparent deviation between the mean doses of the different techniques, but deviation was evident in the maximum and minimum doses. The HI for the Dual-Energy method almost achieved the desirable null value. In conclusion, the Dual-Energy method gave better dose calculation accuracy to the planning treatment volume (PTV) for images with metal artefacts than the GE MAR algorithm or no correction.
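The homogeneity index compared above can be computed from PTV dose statistics. One common definition, assumed here since the abstract does not give its formula, is HI = (Dmax - Dmin) / Dmean; the doses below are illustrative, not results from the study:

```python
# One common HI definition (an assumption here; the thesis may use another):
# HI = (Dmax - Dmin) / Dmean. Lower values mean a more homogeneous dose.
# The doses below are illustrative, not results from the study.

def homogeneity_index(d_max, d_min, d_mean):
    return (d_max - d_min) / d_mean

hi = homogeneity_index(d_max=2.10, d_min=1.96, d_mean=2.00)  # Gy per fraction
print(round(hi, 3))  # about 0.07, within the range of HI values reported above
```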
Abstract:
Here we present orbitally resolved records of terrestrial higher-plant leaf wax input to the North Atlantic over the last 3.5 Ma, based on the accumulation of long-chain n-alkanes and n-alkan-1-ols at IODP Site U1313. These lipids are a major component of dust, even in remote ocean areas, and have a predominantly aeolian origin in distal marine sediments. Our results demonstrate that around 2.7 million years ago (Ma), coinciding with the intensification of Northern Hemisphere glaciation (NHG), the aeolian input of terrestrial material to the North Atlantic increased drastically. Since then, during every glacial the aeolian input of higher-plant material was up to 30 times higher than during interglacials. The close correspondence between aeolian input to the North Atlantic and other dust records indicates a globally uniform response of dust sources to Quaternary climate variability, although the amplitude of variation differs among areas. We argue that the increased aeolian input at Site U1313 during glacials is predominantly related to the episodic appearance of continental ice sheets in North America and the associated strengthening of glaciogenic dust sources. Evolutionary spectral analyses of the n-alkane records were therefore used to determine the dominant astronomical forcing of North American ice-sheet advances. These results demonstrate that during the early Pleistocene North American ice-sheet dynamics responded predominantly to variations in obliquity (41 ka), which argues against previous suggestions of precession-related variations in Northern Hemisphere ice sheets during the early Pleistocene.
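The spectral analysis described above can be illustrated on a synthetic record: a time series paced at the 41 ka obliquity period, sampled every 1 ka, whose dominant period is recovered from an FFT power spectrum. The real analysis was windowed through time to track changes in forcing; this single-window sketch and the record itself are illustrative only:

```python
import numpy as np

# Synthetic record paced at the 41 ka obliquity period, sampled every 1 ka,
# plus noise; the dominant period is recovered from the FFT power spectrum.
rng = np.random.default_rng(1)
t = np.arange(1024)                       # time axis in ka, 1 ka steps
record = np.sin(2 * np.pi * t / 41.0) + 0.3 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(record - record.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0)    # cycles per ka
peak = np.argmax(spectrum[1:]) + 1        # skip the zero-frequency bin
dominant_period = 1.0 / freqs[peak]
print(round(dominant_period, 1))          # close to 41 ka
```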