972 results for Thermal environment


Relevance: 30.00%

Abstract:

The present research concentrates on the fabrication of bulk aluminum matrix nanocomposite structures with carbon nanotube reinforcement. The objective of the work was to fabricate and characterize a multi-walled carbon nanotube (MWCNT) reinforced hypereutectic Al-Si (23 wt% Si, 2 wt% Ni, 1 wt% Cu, balance Al) nanocomposite bulk structure with a nanocrystalline matrix through thermal spray forming techniques, viz. plasma spray forming (PSF) and high velocity oxy-fuel (HVOF) spray forming. This is the first study to show that thermal spray forming can be successfully used to synthesize carbon nanotube reinforced nanocomposites. Microstructural characterization based on quantitative microscopy, scanning and transmission electron microscopy (SEM and TEM), energy dispersive spectroscopy (EDS), X-ray diffraction (XRD), Raman spectroscopy and X-ray photoelectron spectroscopy (XPS) confirms (i) retention and macro/sub-macro level homogeneous distribution of multiwalled carbon nanotubes in the Al-Si matrix and (ii) evolution of nanostructured grains in the matrix. Formation of an ultrathin β-SiC layer on the MWCNT surface, due to a chemical reaction between Si atoms diffusing from the Al-Si alloy and C atoms from the outer walls of the MWCNTs, was confirmed theoretically and experimentally. The SiC layer at the interface improves the wettability and the interfacial adhesion between the MWCNT reinforcement and the Al-Si matrix. Sintering of the as-sprayed nanocomposites was carried out in an inert environment for further densification. The as-sprayed PSF nanocomposite showed lower microhardness than its HVOF counterpart, due to higher porosity content and lower residual stress. The hardness of the nanocomposites increased with sintering time due to effective pore removal. Uniaxial tensile tests on the bulk CNT nanocomposite were carried out, the first study of this nature. The tensile results showed scatter in the data, attributed to the inhomogeneous microstructure and limitations of the test sample geometry. The elastic moduli of the nanocomposites were computed using different micromechanics models and compared with experimentally measured values. The elastic moduli measured by nanoindentation increased gradually with sintering, attributed to porosity removal. The experimentally measured values agreed most closely with the theoretical predictions in the case of the Hashin-Shtrikman bound method.
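
Among the micromechanics models mentioned, the Hashin-Shtrikman bounds are simple enough to sketch. The snippet below computes lower and upper bounds on the effective Young's modulus of a generic two-phase composite; the matrix and reinforcement property values are illustrative placeholders (real MWCNTs are highly anisotropic), not the study's measured data.

```python
# Hashin-Shtrikman bounds for the effective Young's modulus of a two-phase
# composite: a minimal sketch with placeholder material properties.

def kg_from_e_nu(E, nu):
    """Convert Young's modulus and Poisson's ratio to bulk (K) and shear (G) moduli."""
    return E / (3 * (1 - 2 * nu)), E / (2 * (1 + nu))

def e_from_kg(K, G):
    return 9 * K * G / (3 * K + G)

def hashin_shtrikman(E1, nu1, E2, nu2, f2):
    """Bounds on E; phase 1 = softer matrix, phase 2 = stiffer reinforcement."""
    f1 = 1 - f2
    K1, G1 = kg_from_e_nu(E1, nu1)
    K2, G2 = kg_from_e_nu(E2, nu2)
    K_lo = K1 + f2 / (1 / (K2 - K1) + 3 * f1 / (3 * K1 + 4 * G1))
    K_hi = K2 + f1 / (1 / (K1 - K2) + 3 * f2 / (3 * K2 + 4 * G2))
    G_lo = G1 + f2 / (1 / (G2 - G1) + 6 * f1 * (K1 + 2 * G1) / (5 * G1 * (3 * K1 + 4 * G1)))
    G_hi = G2 + f1 / (1 / (G1 - G2) + 6 * f2 * (K2 + 2 * G2) / (5 * G2 * (3 * K2 + 4 * G2)))
    return e_from_kg(K_lo, G_lo), e_from_kg(K_hi, G_hi)

# Illustrative inputs: Al-Si matrix ~80 GPa, an assumed isotropic-equivalent
# modulus for randomly oriented MWCNTs, 10 vol% reinforcement.
E_lo, E_hi = hashin_shtrikman(E1=80.0, nu1=0.33, E2=450.0, nu2=0.20, f2=0.10)
print(f"Hashin-Shtrikman bounds on E: {E_lo:.1f} - {E_hi:.1f} GPa")
```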

Relevance: 30.00%

Abstract:

Fueled by the increasing human appetite for computing performance, semiconductor technology has now marched into the deep sub-micron era. As transistor size keeps shrinking, more and more transistors are integrated into a single chip. This has tremendously increased the power consumption and heat generation of IC chips. The rapidly growing heat dissipation greatly increases packaging and cooling costs, and adversely affects the performance and reliability of a computing system. It also reduces the processor's life span and may even crash the entire computing system. Dynamic thermal management (DTM) is therefore becoming a critical problem in modern computer system design. Extensive theoretical research has been conducted on the DTM problem. However, most of this work is based on idealized assumptions or simplified models. While such models and assumptions greatly simplify a complex problem and make it theoretically manageable, practical computer systems and applications must deal with many factors and details beyond them. The goal of our research was to develop a test platform that can be used to validate theoretical results on DTM under well-controlled conditions, to identify the limitations of existing theoretical results, and to develop new and practical DTM techniques. This dissertation details the background and our research efforts in this endeavor. Specifically, we first developed a customized test platform based on an Intel desktop. We then tested a number of related theoretical works and examined their limitations under a practical hardware environment. With these limitations in mind, we developed a new reactive thermal management algorithm for single-core computing systems to optimize throughput under a peak temperature constraint. We further extended our research to a multicore platform and developed an effective proactive DTM technique for throughput maximization on multicore processors, based on task migration and dynamic voltage and frequency scaling (DVFS). The significance of our research lies in the fact that it complements the extensive theoretical work on increasingly critical thermal problems, enabling the continued evolution of high performance computing systems.
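
As an illustration of a reactive single-core policy of the kind described above, the sketch below steps the DVFS level down when the core temperature approaches a peak constraint and back up when there is thermal headroom. The sysfs path, temperature limit, and frequency levels are generic Linux placeholders, not the dissertation's actual platform or algorithm.

```python
# Minimal sketch of a reactive DTM loop under a peak temperature constraint.
import time

TEMP_LIMIT_C = 85.0                    # peak temperature constraint (assumed)
FREQS_MHZ = [1200, 1800, 2400, 3000]   # available DVFS levels (assumed)

def read_core_temp():
    # Typical Linux thermal-zone interface; the exact path varies by machine.
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read()) / 1000.0  # millidegrees C -> degrees C

def set_frequency(mhz):
    print(f"setting CPU frequency to {mhz} MHz")  # stand-in for a cpufreq write

level = len(FREQS_MHZ) - 1             # start at the highest frequency
for _ in range(600):                   # bounded demo loop
    t = read_core_temp()
    if t >= TEMP_LIMIT_C and level > 0:
        level -= 1                     # reactive throttling: step frequency down
    elif t < TEMP_LIMIT_C - 5 and level < len(FREQS_MHZ) - 1:
        level += 1                     # headroom available: step back up for throughput
    set_frequency(FREQS_MHZ[level])
    time.sleep(0.1)                    # sampling period of the control loop
```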

Relevance: 30.00%

Abstract:

Thermal reaction norms for growth rates of six Emiliania huxleyi isolates originating from the central Atlantic (Azores, Portugal) and five isolates from the coastal North Atlantic (Bergen, Norway) were assessed. We used the template mode of variation model to decompose variation in growth rates into modes of biological interest: vertical shift, horizontal shift, and generalist-specialist variation. In line with their actual habitat conditions, isolates from Bergen (Bergen population) grew well at lower temperatures, while isolates from the Azores (Azores population) performed better at higher temperatures. The optimum growth temperature of the Azores population was significantly higher than that of the Bergen population. Neutral genetic differentiation between the populations was found by microsatellite analysis. These findings indicate that E. huxleyi populations are adapted to local temperature regimes. In addition to between-population variation, we also found variation within populations. Genotype-by-environment interactions resulted in the most pronounced phenotypic differences when isolates were exposed to temperatures outside the range they naturally encounter. Variation in thermal reaction norms between and within populations emphasizes the importance of using more than one isolate when studying the consequences of global change on marine phytoplankton. Phenotypic plasticity and standing genetic variation will be important in determining the potential of natural E. huxleyi populations to cope with global climate change.
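
A minimal sketch of how a thermal reaction norm can be parameterized per isolate: a Gaussian performance curve is fitted to growth rate versus temperature, its three parameters loosely corresponding to the vertical-shift, horizontal-shift, and generalist-specialist modes. The Gaussian form and all data values are illustrative stand-ins, not the study's actual template mode of variation model or measurements.

```python
# Fit a simple thermal performance curve and extract the optimum temperature.
import numpy as np
from scipy.optimize import curve_fit

def tpc(T, mu_max, T_opt, width):
    """Gaussian thermal performance curve: mu_max ~ vertical shift,
    T_opt ~ horizontal shift, width ~ generalist-specialist mode."""
    return mu_max * np.exp(-((T - T_opt) / width) ** 2)

temps = np.array([8, 12, 16, 20, 24, 26, 28])            # assay temperatures, degC (synthetic)
growth = np.array([0.3, 0.6, 0.9, 1.1, 1.0, 0.7, 0.2])   # growth rates, 1/day (synthetic)

(mu_max, T_opt, width), _ = curve_fit(tpc, temps, growth, p0=[1.0, 20.0, 8.0])
print(f"optimum growth temperature: {T_opt:.1f} degC, max rate {mu_max:.2f}/day")
```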

Relevance: 30.00%

Abstract:

Anthropogenic CO2 emissions have caused seawater temperature elevation and ocean acidification. Since both phenomena occur simultaneously, their combined effects on marine species must be experimentally evaluated. The purpose of this study was to estimate the combined effects of seawater acidification and temperature increase on the energy budget of the thick shell mussel Mytilus coruscus. Juvenile mussels were exposed for 14 d to six combined treatments: three pH levels (8.1, 7.7 and 7.3) × two temperatures (25 °C and 30 °C). We found that clearance rates (CR), food absorption efficiencies (AE), respiration rates (RR), ammonium excretion rates (ER), scope for growth (SFG) and O:N ratios were significantly reduced by elevated temperature at some or all time points during the experiment. Low pH had significant negative effects on RR and ER and significantly increased O:N ratios, but showed almost no effect on the CR, AE and SFG of M. coruscus. Nevertheless, interactive effects were observed for RR, ER and O:N ratios. PCA revealed positive relationships among most physiological indicators, especially between SFG and CR, under normal temperature compared with high temperature. PCA also showed that high RR was closely correlated with increasing ER as pH increased. These results suggest that the physiological energetics of juvenile M. coruscus can acclimate to CO2 acidification with little physiological effect, but not to increased temperature. Therefore, the negative effects of a temperature increase could potentially impact the ecophysiological responses of M. coruscus and have significant ecological consequences, mainly in habitats where this species is dominant in terms of abundance and biomass.
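
The scope-for-growth budget underlying these results is a simple energy balance, SFG = A - (R + U): absorbed food energy minus respiratory and excretory losses. The sketch below assembles it from the measured rates; the conversion coefficients are commonly used literature values and the input rates are made-up examples, not data from this study.

```python
# Minimal sketch of a scope-for-growth (SFG) energy budget for a bivalve.

FOOD_ENERGY_J_PER_MG = 23.5   # assumed energy content of the algal diet, J/mg POM
J_PER_ML_O2 = 20.33           # commonly used oxycalorific coefficient
J_PER_MG_NH4_N = 24.87        # commonly used energy loss per mg ammonia-N excreted

def scope_for_growth(cr_l_h, pom_mg_l, ae, rr_ml_o2_h, er_mg_n_h):
    """All rates per individual per hour; returns SFG in J/h."""
    absorbed = cr_l_h * pom_mg_l * FOOD_ENERGY_J_PER_MG * ae  # A = CR * food * AE
    respired = rr_ml_o2_h * J_PER_ML_O2                       # R, respiratory loss
    excreted = er_mg_n_h * J_PER_MG_NH4_N                     # U, excretory loss
    return absorbed - respired - excreted

# Made-up example rates for one juvenile mussel:
print(scope_for_growth(cr_l_h=1.2, pom_mg_l=0.8, ae=0.6,
                       rr_ml_o2_h=0.35, er_mg_n_h=0.01), "J/h")
```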

Relevance: 30.00%

Abstract:

Within Canada there are more than 2.5 million bundles of spent nuclear fuel, with approximately 2 million more bundles to be generated in the future. Canada, like every country around the world that has taken a decision on the management of spent nuclear fuel, has decided on long-term containment and isolation of the fuel within a deep geological repository. At depth, a deep geological repository consists of a network of placement rooms where the bundles will be located within a multi-layered system of engineered and natural barriers. The barriers will be placed in a complex thermal-hydraulic-mechanical-chemical-biological (THMCB) environment, and a large database of material properties for all components in the repository is required to construct representative models. Within the repository, the sealing materials will experience elevated temperatures due to the thermal gradient produced by radioactive decay heat from the waste inside the container. Furthermore, high porewater pressure due to the depth of the repository, along with possibly elevated groundwater salinity, will place the bentonite-based materials under transient hydraulic conditions. It is therefore crucial to characterize the sealing materials over a wide range of thermal-hydraulic conditions. A comprehensive experimental program has been conducted to measure the properties (mainly thermal properties) of all sealing materials involved in the Mark II concept at plausible thermal-hydraulic conditions. The thermal response of Canada's concept for a deep geological repository has been modelled using the experimentally measured thermal properties. Plausible scenarios are defined, and their effects on the container surface temperature as well as on the surrounding geosphere are examined to assess whether the design criteria are met for the cases studied. The thermal response shows that, even if all materials are in a dried condition, the repository still performs acceptably as long as the sealing materials remain in contact.
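
To give a flavour of the thermal-response calculations involved, the sketch below evaluates the classical infinite-line-source solution for the temperature rise around a heat-emitting container, dT = q/(4*pi*k) * E1(r^2/(4*alpha*t)). All property and heat-load values are generic placeholders, not the measured Mark II sealing-material data or the repository model used in the thesis.

```python
# Temperature rise around an idealized line heat source in a uniform medium.
import numpy as np
from scipy.special import exp1  # exponential integral E1

k = 2.0        # thermal conductivity, W/(m K) (assumed)
alpha = 1.0e-6 # thermal diffusivity, m^2/s (assumed)
q = 100.0      # decay heat per unit length of container, W/m (assumed)

def temp_rise(r_m, t_years):
    """Temperature rise (K) at radius r after time t, constant heat rate."""
    t = t_years * 3.154e7  # years -> seconds
    return q / (4 * np.pi * k) * exp1(r_m**2 / (4 * alpha * t))

for years in (1, 10, 100):
    print(f"{years:>4} a: dT at r = 0.6 m is {temp_rise(0.6, years):.1f} K")
```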

Relevance: 30.00%

Abstract:

The urban heat island effect is often associated with large metropolises. However, in the Netherlands even small cities will be affected by the phenomenon in the future (Hove et al., 2011), due to dispersed or mosaic urbanisation patterns, particularly in the southern part of the country, the province of North Brabant. This study analyses the average nighttime land surface temperature (LST) of 21 North Brabant urban areas using 22 satellite images retrieved by MODIS 11A1 during the 2006 heat wave, and uses Landsat 5 Thematic Mapper to map albedo and normalized difference vegetation index (NDVI) values. Albedo, NDVI and imperviousness are found to play the most relevant roles in the increase of nighttime LST. A surface-cover cluster analysis of these three parameters reveals that the 12 "urban living environment" categories used in the region of North Brabant can be reduced to 7 categories. This simplifies the design guidelines for improving the surface thermal behaviour of the different neighbourhoods, thereby reducing the urban heat island (UHI) effect in existing medium-sized cities and in future developments adjacent to those cities.
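
The two remote-sensing ingredients of this analysis can be sketched as follows: NDVI from the Landsat 5 TM red and near-infrared bands, and a k-means clustering of per-area surface parameters into 7 groups. The arrays are synthetic placeholders; only the band arithmetic and the clustering step reflect the method described.

```python
# NDVI computation and surface-cover clustering, on synthetic stand-in data.
import numpy as np
from sklearn.cluster import KMeans

red = np.random.rand(100, 100)   # TM band 3 reflectance (synthetic)
nir = np.random.rand(100, 100)   # TM band 4 reflectance (synthetic)
ndvi = (nir - red) / (nir + red + 1e-9)  # standard NDVI definition

# One row per urban area: [albedo, mean NDVI, imperviousness] (synthetic).
features = np.random.rand(21, 3)
labels = KMeans(n_clusters=7, n_init=10, random_state=0).fit_predict(features)
print(labels)  # cluster membership of the 21 urban areas
```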

Relevance: 30.00%

Abstract:

In order to optimize frontal detection in sea surface temperature fields at 4 km resolution, a combined statistical and expert-based approach is applied to test different spatial smoothings of the data prior to the detection process. Fronts are usually detected at 1 km resolution using the histogram-based, single-image edge detection (SIED) algorithm developed by Cayula and Cornillon in 1992, with a standard preliminary smoothing using a median filter and a 3 × 3 pixel kernel. Here, detections are performed in three study regions (off Morocco, the Mozambique Channel, and north-western Australia) and across the Indian Ocean basin using the combination of multiple windows (CMW) method developed by Nieto, Demarcq and McClatchie in 2012, which improves on the original Cayula and Cornillon algorithm. Detections at 4 km and 1 km resolution are compared. Fronts are divided into two intensity classes ("weak" and "strong") according to their thermal gradient. A preliminary smoothing is applied prior to detection using different convolutions: three types of filters (median, average and Gaussian) combined with four kernel sizes (3 × 3, 5 × 5, 7 × 7, and 9 × 9 pixels) and three detection window sizes (16 × 16, 24 × 24 and 32 × 32 pixels), to test the effect of these smoothing combinations on reducing the background noise of the data and thereby improving frontal detection. The performance of the combinations on 4 km data is evaluated using two criteria: detection efficiency and front length. We find that the optimal combination of preliminary smoothing parameters for enhancing detection efficiency while preserving front length comprises a median filter, a 16 × 16 pixel window size, and a 5 × 5 pixel kernel for strong fronts or a 7 × 7 pixel kernel for weak fronts. Results show an improvement in detection performance (from largest to smallest window size) of 71% for strong fronts and 120% for weak fronts. Despite the small window used (16 × 16 pixels), front length is preserved relative to that found with 1 km data. This optimal preliminary smoothing and the CMW detection algorithm on 4 km sea surface temperature data are then used to describe the spatial distribution of the monthly frequencies of occurrence of both strong and weak fronts across the Indian Ocean basin. In general, strong fronts are observed in coastal areas whereas weak fronts, with some seasonal exceptions, are mainly located in the open ocean. This study shows that adequate noise reduction through a preliminary smoothing of the data considerably improves frontal detection efficiency as well as the overall quality of the results. Consequently, the use of 4 km data enables frontal detections similar to those from 1 km data (using a standard 3 × 3 median convolution) in terms of detectability, length and location. This method, using 4 km data, is easily applicable to large regions or at the global scale, with far fewer constraints on data manipulation and processing time relative to 1 km data.
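
A minimal sketch of the preliminary smoothing step described above: a median filter of the chosen kernel size is applied to the SST field before it is scanned in detection windows. The window-level "front" test below is a crude stand-in for the histogram-based SIED/CMW detector, and the SST field is synthetic.

```python
# Preliminary median smoothing followed by window-based front screening.
import numpy as np
from scipy.ndimage import median_filter

def smooth(sst, kernel=5):
    """Median smoothing; e.g. 5x5 kernel for strong fronts, 7x7 for weak fronts."""
    return median_filter(sst, size=kernel)

def has_front(window, min_step=0.4):
    """Crude proxy: a front implies two temperature modes separated by a step (degC)."""
    lo, hi = np.percentile(window, [25, 75])
    return (hi - lo) >= min_step

sst = np.cumsum(np.random.randn(256, 256) * 0.05, axis=1) + 20.0  # synthetic SST field
sm = smooth(sst, kernel=5)
win = 16  # 16x16 pixel detection window
hits = [(i, j) for i in range(0, 256 - win, win) for j in range(0, 256 - win, win)
        if has_front(sm[i:i + win, j:j + win])]
print(f"{len(hits)} windows flagged as containing a front")
```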

Relevance: 30.00%

Abstract:

Animal welfare issues have received much attention, not only to meet farmed animals' requirements, but also to address ethical and cultural public concerns. Daily collected information, as well as the systematic follow-up of production stages, produces important statistical data for production assessment and control, and for identifying improvement possibilities. In this scenario, this research study analyzed behavioral, production, and environmental data using multivariate principal component analysis, which correlated observed behaviors, recorded using video cameras and electronic identification, with performance parameters of female broiler breeders. The aim was to start building a system to support decision-making in broiler breeder housing based on bird behavioral parameters. Birds were housed in an environmental chamber with three pens under different controlled environments. Bird sensitivity to environmental conditions was indicated by their behaviors, stressing the importance of behavioral observations for modern poultry management. A strong association was found between performance parameters and behavior at the nest, suggesting that this behavior may be used to predict productivity. The behaviors of ruffling feathers, opening wings, preening, and being at the drinker were negatively correlated with environmental temperature, suggesting that an increase in the frequency of these behaviors indicates improved thermal welfare.
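
As an illustration of the multivariate step, the sketch below standardizes a table of behavioural frequencies and performance measures and inspects the principal-component loadings, which is how associations such as nest behaviour versus productivity would surface. The data matrix is a synthetic placeholder, not the study's recordings.

```python
# Principal component analysis of a behaviour/performance table.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = observation periods; columns = [nest, ruffling, wing-opening,
# preening, drinker, egg production] (synthetic frequencies/rates).
X = np.random.rand(60, 6)

Z = StandardScaler().fit_transform(X)   # standardize mixed-unit variables
pca = PCA(n_components=2).fit(Z)
print(pca.explained_variance_ratio_)    # share of variance per component
print(pca.components_)                  # loadings: which variables co-vary
```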

Relevance: 30.00%

Abstract:

Atmospheric scattering plays a crucial role in degrading the performance of electro-optical imaging systems operating in the visible and infra-red spectral bands, and hence limits the quality of the acquired images, either through reduction of contrast or increase of image blur. The exact nature of light scattering by atmospheric media is highly complex and depends on the types, orientations, sizes and distributions of the particles constituting these media, as well as the wavelengths, polarization states and directions of the propagating radiation. Here we follow the common approach for solving imaging and propagation problems by treating the light propagating through atmospheric media as composed of two main components: a direct (unscattered) component and a scattered component. In this work we developed a detailed model of the effects of absorption and scattering by haze and fog atmospheric aerosols on the optical radiation propagating from the object plane to an imaging system, based on the classical theory of EM scattering. This detailed model is then used to compute the average point spread function (PSF) of an imaging system, properly accounting for the effects of diffraction, scattering, and the appropriate optical power levels of both the direct and the scattered radiation arriving at the pupil of the imaging system. The calculated PSF, properly weighted for the energy contributions of the direct and scattered components, is used in combination with a radiometric model to estimate the average numbers of direct and scattered photons detected at the sensor plane, which are then used to calculate the image-spectrum signal-to-noise ratio (SNR) in the visible, near infra-red (NIR) and mid infra-red (MIR) spectral wavelength bands. Reconstruction of images degraded by atmospheric scattering and measurement noise is then performed, up to the limit imposed by the noise-effective cutoff spatial frequency of the image-spectrum SNR. Key results of this research are as follows. A mathematical model, based on Mie scattering theory, of how scattering from aerosols affects the overall PSF of an imaging system was developed, coded in MATLAB, and demonstrated. This model, along with radiometric theory, was used to predict the limiting resolution of an imaging system as a function of the optics, the scattering environment, and measurement noise. Finally, image reconstruction algorithms were developed and demonstrated which mitigate the effects of scattering-induced blurring to within the limits imposed by noise.
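
The core modelling idea, an average PSF formed as an energy-weighted sum of a narrow direct-light kernel and a broad scattered-light kernel, can be sketched as below. Gaussians stand in for the actual diffraction and Mie-derived PSFs, the optical depth is an assumed value, and the single-scattering weighting is a simplification of the full model.

```python
# Energy-weighted average PSF: direct (narrow) + scattered (broad) components.
import numpy as np

def gaussian_psf(n, sigma):
    """Normalized 2-D Gaussian kernel as a stand-in PSF."""
    ax = np.arange(n) - n // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

tau = 1.2                        # aerosol optical depth along the path (assumed)
w_direct = np.exp(-tau)          # direct-component energy fraction (Beer-Lambert)
w_scattered = 1 - w_direct       # scattered fraction reaching the pupil (simplified)

psf_avg = w_direct * gaussian_psf(65, 1.5) + w_scattered * gaussian_psf(65, 12.0)
print(psf_avg.sum())             # ~1: energy-normalized average PSF
```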

Relevance: 30.00%

Abstract:

This paper analyses the influence of an extreme Saharan desert dust (DD) event on shortwave (SW) and longwave (LW) radiation at the EARLINET/AERONET Évora station (Southern Portugal) from 4 to 7 April 2011. There was also some cloud occurrence in the period, so it is essential to quantify the effect of cloud presence on aerosol radiative forcing. A radiative transfer model was initialized with aerosol optical properties, cloud vertical properties and meteorological atmospheric vertical profiles. The intercomparison between the instantaneous TOA shortwave and longwave fluxes derived from CERES and those calculated with SBDART, fed with aerosol extinction coefficients derived from the CALIPSO and PAOLI lidar observations while varying the OPAC dataset parameters, showed reasonable agreement within the standard deviations. The dust aerosol type yielding the best fit was the mineral accumulation mode. Therefore, the SBDART model constrained with the CERES observations can be used to reliably determine aerosol radiative forcing and heating rates. Aerosol radiative forcings and heating rates were derived in the SW (ARFSw, AHRSw) and LW (ARFLw, AHRLw) spectral ranges, considering a cloud- and aerosol-free reference atmosphere. We found that AOD at 440 nm increased by a factor of 5 on 6 April with respect to the lower dust load on 4 April. It was responsible for a strong radiative cooling effect at the surface, indicated by the ARFSw value (−99 W/m2 for a solar zenith angle of 60°), partly offset by a radiative warming effect indicated by the ARFLw value (+21.9 W/m2). Overall, about 24% and 12% of the dust solar radiative cooling effect is compensated by its longwave warming effect at the surface and at the top of the atmosphere, respectively. Hence, larger aerosol loads could enhance the interplay between the absorption and re-emission processes, increasing ARFLw with respect to that associated with moderate and low aerosol loads. The results of this work complement findings from other regions on the modification of the radiative energy budget by dust aerosols, which could have a relevant influence on regional climate and will be a topic for future investigations.
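
For the bookkeeping behind the forcing values quoted above: aerosol radiative forcing is the flux difference between the dust-laden and the aerosol-free reference atmosphere, evaluated per spectral range. The sketch below re-uses the paper's surface values only to show the arithmetic; the function is the generic definition, not the SBDART workflow.

```python
# Aerosol radiative forcing as a flux difference, per spectral range.
def radiative_forcing(f_with_aerosol, f_reference):
    """W/m^2; negative = cooling relative to the aerosol-free atmosphere."""
    return f_with_aerosol - f_reference

arf_sw = -99.0   # surface SW forcing at 60 deg solar zenith angle (from the text)
arf_lw = +21.9   # surface LW forcing (from the text)
print(f"net surface forcing: {arf_sw + arf_lw:+.1f} W/m^2")  # LW partly offsets SW
```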

Relevance: 30.00%

Abstract:

Modern scientific discoveries are driven by an insatiable demand for computational resources. High-Performance Computing (HPC) systems are an aggregation of computing power designed to deliver considerably higher performance than a typical desktop computer, in order to solve large problems in science, engineering, or business. An HPC room in the datacenter is a complex controlled environment that hosts thousands of computing nodes consuming electrical power in the range of megawatts, all of which is ultimately transformed into heat. Although a datacenter contains sophisticated cooling systems, our studies provide quantitative evidence of thermal bottlenecks in real-life production workloads, showing significant spatial and temporal thermal and power heterogeneity. Minor thermal anomalies can therefore start a chain of events that leads to an imbalance between the heat generated by the computing nodes and the heat removed by the cooling system, giving rise to thermal hazards. Although thermal anomalies are rare events, timely anomaly detection and prediction is vital to avoid damage to IT and facility equipment and outages of the datacenter, with severe societal and business losses. For this reason, automated approaches to detect thermal anomalies in datacenters have considerable potential. This thesis analyzed and characterized the power and thermal behaviour of a Tier0 datacenter (CINECA) during production and under abnormal thermal conditions. A Deep Learning (DL)-powered thermal hazard prediction framework is then proposed. The proposed models are validated against real thermal hazard events reported for the studied HPC cluster while in production. To the best of my knowledge, this thesis is the first empirical study of thermal anomaly detection and prediction techniques on a real large-scale HPC system. For this thesis, I used a large-scale dataset: monitoring data from tens of thousands of sensors over around 24 months, collected at a rate of roughly one sample every 20 seconds.
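
As a minimal baseline for the anomaly-detection task framed above, the sketch below scores each sample of a synthetic temperature series against a rolling per-sensor statistic and flags large deviations. The thesis's DL-based framework is far richer; this only illustrates the problem setting.

```python
# Rolling z-score anomaly flagging on a single sensor's temperature series.
import numpy as np

def rolling_zscore_anomalies(series, window=180, z_thresh=4.0):
    """Flag samples deviating strongly from the recent rolling mean."""
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = recent.mean(), recent.std() + 1e-9
        flags[i] = abs(series[i] - mu) > z_thresh * sigma
    return flags

temps = 35 + np.random.randn(5000) * 0.3   # ~20 s samples of an inlet sensor (synthetic)
temps[4000:4050] += 6.0                    # injected thermal hazard
print(np.where(rolling_zscore_anomalies(temps))[0][:5])  # first flagged indices
```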

Relevance: 30.00%

Abstract:

The design process of any electric vehicle system has to be oriented towards the best energy efficiency, together with the constraint of maintaining comfort in the vehicle cabin. The main aim of this study is to identify the best thermal management solution in terms of HVAC efficiency without compromising occupant comfort and internal air quality. An Arduino-controlled low-cost sensor system was developed, compared against reference instrumentation (average R-squared of 0.92), and then used to characterise the vehicle cabin in trials under real parking and driving conditions. Data on the energy use of the HVAC were retrieved from the car's On-Board Diagnostic port. Energy savings using recirculation can reach 30%, but pollutant concentrations in the cabin build up in this operating mode. Moreover, the temperature profile appeared strongly non-uniform, with air temperature differences of up to 10 °C. Optimisation methods often require a high number of runs to find the optimal configuration of the system. Fast models proved beneficial for this task, while CFD-1D models are usually slower despite the higher level of detail they provide. In this work, the collected dataset was used to train a fast ML model of both cabin and HVAC using linear regression. The average scaled RMSE over all trials is 0.4%, while the computation time is 0.0077 ms for each second of simulated time on a laptop computer. Finally, a reinforcement learning environment was built with OpenAI Gym and Stable-Baselines3, using the built-in Proximal Policy Optimisation (PPO) algorithm to update the policy and seek the best compromise between comfort, air quality and energy reward terms. The learning curves show an overall oscillating behaviour, with only two experiments behaving as expected, and even those converging too slowly. This result leaves large room for improvement, ranging from reward-function engineering to expansion of the ML model.
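
A compact sketch of the reinforcement-learning setup named above: a custom Gym-style cabin environment whose reward combines comfort, air-quality, and energy terms, trained with Stable-Baselines3's built-in PPO. The dynamics, bounds, and reward coefficients are toy assumptions, not the thesis's trained cabin/HVAC model.

```python
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO

class CabinEnv(gym.Env):
    """Toy cabin/HVAC environment: state = [air temperature degC, CO2 ppm]."""

    def __init__(self):
        self.observation_space = spaces.Box(low=np.array([-10.0, 0.0], dtype=np.float32),
                                            high=np.array([50.0, 5000.0], dtype=np.float32))
        # action = [cooling/heating power in -1..1, recirculation fraction in 0..1]
        self.action_space = spaces.Box(low=np.array([-1.0, 0.0], dtype=np.float32),
                                       high=np.array([1.0, 1.0], dtype=np.float32))

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.temp, self.co2, self.t = 35.0, 800.0, 0  # hot-soaked cabin at start
        return np.array([self.temp, self.co2], dtype=np.float32), {}

    def step(self, action):
        power, recirc = float(action[0]), float(action[1])
        self.temp += -2.0 * power - 0.1 * (self.temp - 30.0)  # toy thermal dynamics
        self.co2 += 40.0 * recirc - 25.0 * (1.0 - recirc)     # recirculation builds up CO2
        self.temp = float(np.clip(self.temp, -10.0, 50.0))
        self.co2 = float(np.clip(self.co2, 0.0, 5000.0))
        comfort = -abs(self.temp - 22.0)                      # comfort term
        air_quality = -max(self.co2 - 1000.0, 0.0) / 100.0    # CO2 penalty term
        energy = -abs(power) * (1.0 - 0.3 * recirc)           # recirc reduces HVAC load
        self.t += 1
        obs = np.array([self.temp, self.co2], dtype=np.float32)
        return obs, comfort + air_quality + energy, self.t >= 200, False, {}

model = PPO("MlpPolicy", CabinEnv(), verbose=0)
model.learn(total_timesteps=10_000)  # short demo run; real training needs far more
```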

Relevance: 30.00%

Abstract:

Flaring has been widely used in upstream oil and gas operations, both onshore and offshore. It is considered a safe and reliable way to protect assets from overpressure, and the environment from toxic gas, through combustion. However, there are drawbacks to using flares, such as vibration and thermal radiation. Excessive exposure to thermal radiation is harmful to offshore personnel and equipment, and research organizations and companies have invested time and money to combat it. Many technologies have been developed to reduce the risk from thermal radiation, one of them being the water curtain system. Several tests were done to assess the effectiveness of the water curtain system in mitigating thermal radiation in an offshore environment. Each test varied the flare output, the wind speed, and the water droplet size of the water curtain. The results of each test were then compared and analyzed. The results showed that a water curtain system can be a solution to excessive thermal radiation from an offshore flare. Moreover, a water curtain with a smaller droplet diameter gives a more favorable result in reducing thermal radiation. These results suggest that, although the water curtain offers simplicity and efficiency, designing an effective system requires deep study: conditions such as wind speed, flare intensity, and droplet size play a vital role in how effectively the water curtain attenuates thermal radiation.
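
A back-of-envelope sketch of the quantity being mitigated: treating the flare as a point source, incident radiation falls off as K = tau * F * Q / (4 * pi * D^2), and the water curtain can be represented as an extra transmissivity factor. All numbers below, including the curtain transmissivity, are illustrative assumptions rather than test results.

```python
# Point-source thermal radiation from a flare, with an attenuation factor
# standing in for the water curtain.
import math

def radiant_intensity(q_released_w, frac_radiated, distance_m,
                      atm_transmissivity=1.0, curtain_transmissivity=1.0):
    """Incident thermal radiation in W/m^2 at a given distance (point-source model)."""
    return (atm_transmissivity * curtain_transmissivity *
            frac_radiated * q_released_w / (4 * math.pi * distance_m**2))

q = 50e6  # flare heat release, W (assumed)
no_curtain = radiant_intensity(q, frac_radiated=0.3, distance_m=40,
                               atm_transmissivity=0.8)
with_curtain = radiant_intensity(q, frac_radiated=0.3, distance_m=40,
                                 atm_transmissivity=0.8, curtain_transmissivity=0.4)
print(f"{no_curtain:.0f} W/m^2 -> {with_curtain:.0f} W/m^2 behind the curtain")
```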

Relevance: 30.00%

Abstract:

Nowadays, product development in all its phases plays a fundamental role in the industrial chain. The need for a company to compete at a high level, and to respond quickly to market demands by engineering products rapidly and to a high standard of quality, has driven the adoption of new, more advanced methods and processes. In recent years, design and production have been moving away from the 2D-based approach towards the concept of Model Based Definition. With this approach, increasingly complex systems become easier to handle and, above all, cheaper to realize. Model Based Definition makes it possible to share data in a lean and simple way across the entire engineering and production chain of the product; its great advantage is precisely the uniqueness of the information. In this thesis work, the approach has been applied in the context of tolerances with the aid of CAD/CAT software. Tolerance analysis, or dimensional variation analysis (DVA), is a way to understand how sources of variation in part dimensions and assembly constraints propagate between parts and assemblies, and how the resulting variation affects the ability of a design to meet its requirements. Tolerance directly affects the cost and performance of products. Worst Case Analysis (WCA) and statistical analysis (Root Sum of Squares, RSS) are the two principal methods in DVA. The thesis aims to show the advantages of statistical dimensional analysis by creating and examining various case studies, using PTC CREO software for CAD modeling and CETOL 6σ for tolerance analysis. Moreover, a comparison between manual and 3D analysis is provided, with attention focused on the information lost in the 1D case. The results obtained highlight the need to use this approach from the early stages of the product design cycle.
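
The two stack-up methods named above differ only in how the chain tolerances are combined, as the sketch below shows on an illustrative one-dimensional chain: WCA sums the absolute tolerances, while RSS combines them in quadrature, yielding a tighter (statistical) estimate.

```python
# Worst Case vs. Root Sum of Squares tolerance stack-up on a 1-D chain.
import math

tolerances = [0.10, 0.05, 0.08, 0.12]  # +/- tolerances of the chain, mm (illustrative)

t_wc = sum(tolerances)                            # Worst Case Analysis: all at limit
t_rss = math.sqrt(sum(t**2 for t in tolerances))  # RSS: statistical combination

print(f"WCA : +/-{t_wc:.3f} mm")   # 0.350 mm
print(f"RSS : +/-{t_rss:.3f} mm")  # ~0.183 mm, tighter than worst case
```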