950 results for thermal decomposition
Abstract:
A numerical investigation has been carried out for the coupled thermal boundary layers on both sides of a partition placed in an isosceles triangular enclosure along its middle symmetric line. The working fluid is air, which is initially quiescent. A sudden temperature difference between the two zones of the enclosure is imposed to trigger natural convection. The numerical simulations show that the development of the coupled thermal boundary layers adjacent to the partition undergoes three distinct stages: an initial stage, a transitional stage, and a steady-state stage. Time-dependent features of the coupled thermal boundary layers, as well as the overall natural convection flow in the partitioned enclosure, are discussed and compared with the non-partitioned enclosure. Moreover, heat transfer through the coupled thermal boundary layers and the inclined walls, expressed as local and overall average Nusselt numbers, is also examined.
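As a hypothetical illustration of the Nusselt-number post-processing mentioned above, the sketch below computes a local Nusselt number from the wall-normal temperature gradient of a mock boundary-layer profile; the grid, characteristic length, and temperature difference are invented, not taken from the paper.

```python
import numpy as np

# Hypothetical simulated temperature profile near the partition:
# T[j] is the fluid temperature at wall-normal distance y[j].
L = 1.0                               # characteristic enclosure length (assumed)
dT = 10.0                             # imposed temperature difference (assumed)
y = np.linspace(0.0, 0.01, 50)        # wall-normal coordinates (m)
T = 300.0 + dT * np.exp(-y / 0.002)   # mock boundary-layer profile

# Local Nusselt number from the wall-normal temperature gradient:
# Nu = -(L / dT) * dT/dy evaluated at the wall (y = 0).
dTdy_wall = (T[1] - T[0]) / (y[1] - y[0])   # one-sided finite difference
nu_local = -(L / dT) * dTdy_wall
print(f"local Nu ~ {nu_local:.1f}")
```

The overall average Nusselt number would follow by integrating such local values along the partition or inclined walls and dividing by the wall length.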
Abstract:
Background Many Australian cities experience large winter increases in deaths and hospitalisations. Flu outbreaks are only part of the problem, and inadequate protection from cold weather is a key independent risk factor. Better home insulation has been shown to improve health during winter, but no study has examined whether better personal insulation improves health. Data and Methods We ran a randomised controlled trial of thermal clothing versus usual care. Subjects with heart failure (a group vulnerable to cold) were recruited from a public hospital in Brisbane in winter and followed up at the end of winter. Those randomised to the intervention received two thermal hats and tops and a digital thermometer. The primary outcome was the number of days in hospital, with secondary outcomes of General Practitioner (GP) visits and self-rated health. Results The mean number of days in hospital per 100 winter days was 2.5 in the intervention group and 1.8 in the usual care group, a mean difference of 0.7 (95% CI: –1.5, 5.4). The intervention group had 0.2 fewer GP visits on average (95% CI: –0.8, 0.3) and higher self-rated health, with a mean improvement of –0.3 (95% CI: –0.9, 0.3). The thermal tops were generally well used, but even in cold temperatures the hats were worn by only 30% of subjects. Conclusions Thermal clothing is a cheap and simple intervention, but further work needs to be done on increasing compliance and confirming the health and economic benefits of providing thermals to at-risk groups.
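For readers unfamiliar with the primary outcome metric, the following sketch shows how "days in hospital per 100 winter days" and the between-arm mean difference could be computed; all subject data below are invented.

```python
import numpy as np

# Hypothetical per-subject data: days in hospital and winter follow-up days.
hosp_days_intervention = np.array([0, 3, 0, 7, 1])
follow_up_intervention = np.array([90, 85, 92, 88, 90])
hosp_days_usual = np.array([0, 1, 2, 0, 4])
follow_up_usual = np.array([91, 89, 90, 87, 92])

# Primary outcome: days in hospital per 100 winter days, per subject.
rate_i = 100 * hosp_days_intervention / follow_up_intervention
rate_u = 100 * hosp_days_usual / follow_up_usual
diff = rate_i.mean() - rate_u.mean()
print(f"mean difference: {diff:.2f} hospital days per 100 winter days")
```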
Abstract:
A new bioluminescent creatine kinase (CK) assay using purified luciferase was used to analyse CK activity in serum samples dried on filter paper. Enzyme activity was preserved for over 1 week on paper stored at room temperature. At 60°C, CK activity in liquid serum samples was rapidly lost, but the activity of enzyme stored on paper was preserved for at least 2 days.
Abstract:
The low-altitude aircraft inspection of powerlines, or other linear infrastructure networks, is emerging as an important application requiring specialised control technologies. Despite some recent advances in automated control related to this application, control of the underactuated aircraft vertical dynamics has not been completely achieved, especially in the presence of thermal disturbances. Rejection of thermal disturbances represents a key challenge in the control of inspection aircraft due to the underactuated nature of the dynamics and the specified speed, altitude, and pitch constraints. This paper proposes a new vertical controller consisting of a backstepping elevator controller combined with a feedforward-feedback throttle controller. The performance of our proposed approach is evaluated against two existing candidate controllers.
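Backstepping itself can be illustrated on a toy double-integrator altitude model (not the paper's aircraft dynamics); the gains, set-point, and initial state below are assumptions.

```python
# Toy double-integrator altitude model: h' = v, v' = u.
# Standard Lyapunov backstepping: z1 = h - h_ref, virtual speed a = -k1*z1
# (for a constant set-point), z2 = v - a, control u = a' - z1 - k2*z2,
# which yields V' = -k1*z1^2 - k2*z2^2 for V = (z1^2 + z2^2)/2.
k1, k2 = 1.5, 2.0
dt, T = 0.01, 10.0
h_ref = 50.0                      # constant altitude set-point (assumed)
h, v = 40.0, 0.0                  # initial altitude and vertical speed

for _ in range(int(T / dt)):
    z1 = h - h_ref
    a = -k1 * z1                  # virtual control for the altitude subsystem
    z2 = v - a
    a_dot = -k1 * v               # da/dt = -k1 * z1' = -k1 * v
    u = a_dot - z1 - k2 * z2      # stabilizing elevator-like input
    v += u * dt                   # semi-implicit Euler integration
    h += v * dt

print(f"final altitude: {h:.2f} m (target {h_ref} m)")
```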
Abstract:
Multi-objective optimization has been performed for the design of a benchmark cogeneration system known as the CGAM cogeneration system. In the optimization approach, thermoeconomic and environmental aspects have been considered simultaneously. The environmental objective function has been defined and expressed in cost terms. A Multi-Objective Particle Swarm Optimization (MOPSO) algorithm, one of the most suitable techniques developed within this class of search algorithms, has been used here. This approach has been applied to find the set of Pareto optimal solutions with respect to the aforementioned objective functions. An example of fuzzy decision-making with the aid of the Bellman-Zadeh approach has been presented and a final optimal solution has been introduced.
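A minimal sketch of the Bellman-Zadeh max-min selection step over a Pareto front: each objective is mapped to a linear membership in [0, 1], and the solution with the largest minimum membership is chosen. The Pareto points below are hypothetical.

```python
import numpy as np

# Hypothetical Pareto-optimal points: columns are (total cost,
# environmental cost), both to be minimized.
pareto = np.array([
    [10.0, 4.0],
    [11.5, 3.0],
    [13.0, 2.2],
    [15.0, 1.8],
])

# Bellman-Zadeh: linear membership functions, mu = 1 at the per-objective
# minimum and mu = 0 at the per-objective maximum.
f_min = pareto.min(axis=0)
f_max = pareto.max(axis=0)
mu = (f_max - pareto) / (f_max - f_min)

# Final choice: maximize the minimum membership across objectives (max-min).
best = np.argmax(mu.min(axis=1))
print(f"selected solution: {pareto[best]}, memberships: {mu[best]}")
```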
Abstract:
Objective To describe the trends of overall mortality and major causes of death in the Shandong population from 1970 to 2005, and to quantitatively estimate the influential factors. Methods Trends of overall mortality and major causes of death were described using indicators such as mortality rates and age-adjusted death rates, comparing three large-scale mortality surveys in Shandong province. A difference decomposition method was applied to estimate the contributions of demographic and non-demographic factors to the change in mortality. Results Total mortality changed only slightly from the 1970s but increased from the 1990s; however, both age-adjusted and age-specific mortality rates decreased significantly. The mortality of Group I diseases, including infectious diseases as well as maternal and perinatal diseases, decreased drastically. By contrast, the mortality of non-communicable chronic diseases (NCDs), including cardiovascular diseases (CVDs) and cancer, as well as injuries, increased. The maintenance of the recent overall mortality level was caused by the interaction of demographic and non-demographic factors working in opposite directions. Non-demographic factors were responsible for the decrease in Group I diseases and the increase in injuries. With respect to the increase of NCDs as a whole, demographic factors might bear full responsibility, while non-demographic factors acted as an opposing force reducing mortality. Nevertheless, for leading NCDs such as CVDs and cancer, the increase was mainly due to non-demographic rather than demographic factors. Conclusion Through the interaction of an ageing population and strengthening non-demographic effects, overall mortality in Shandong is expected to remain stable or rise slightly in the coming years. Group I diseases in Shandong have been effectively brought under control. Strategies for disease control and prevention should shift to chronic diseases, especially leading NCDs such as CVDs and cancer.
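The difference decomposition idea can be sketched with a Kitagawa-style split of the change in crude mortality into an age-structure (demographic) component and an age-specific-rate (non-demographic) component; the rates and population shares below are invented.

```python
import numpy as np

# Hypothetical age-specific death rates (per 1000) and population shares
# for two survey periods, by age group.
rate_1970 = np.array([2.0, 1.0, 5.0, 30.0])
share_1970 = np.array([0.35, 0.40, 0.18, 0.07])
rate_2005 = np.array([0.8, 0.6, 4.0, 28.0])
share_2005 = np.array([0.22, 0.38, 0.27, 0.13])

crude_1 = (rate_1970 * share_1970).sum()
crude_2 = (rate_2005 * share_2005).sum()

# Kitagawa decomposition: the demographic effect weights the change in age
# structure by average rates; the non-demographic effect weights the change
# in rates by average shares. The two components sum exactly to the total.
demographic = ((share_2005 - share_1970) * (rate_1970 + rate_2005) / 2).sum()
non_demographic = ((rate_2005 - rate_1970) * (share_1970 + share_2005) / 2).sum()

print(f"total change: {crude_2 - crude_1:.3f}")
print(f"demographic: {demographic:.3f}, non-demographic: {non_demographic:.3f}")
```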
Abstract:
Faulted stacking layers are ubiquitously observed during the crystal growth of semiconducting nanowires (NWs). In this paper, we employ reverse non-equilibrium molecular dynamics simulations to elucidate the effect of various faulted stacking layers on the thermal conductivity (TC) of silicon (Si) NWs. We find that stacking faults can greatly reduce the TC of the Si NW. Among the different stacking faults that are parallel to the NW's axis, the 9R polytype structure and the intrinsic and extrinsic stacking faults (iSFs and eSFs) exert more pronounced effects in reducing the TC than the twin boundary (TB). However, for perpendicularly aligned faulted stacking layers, the eSFs and 9R polytype structures induce a larger reduction in the TC of the NW than the TB and iSFs. For all considered NWs, the TC does not depend strongly on the number of faulted stacking layers. Our studies suggest the possibility of tuning the thermal properties of Si NWs by altering the crystal structure via different faulted stacking layers.
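A sketch of how TC is typically extracted from RNEMD output: a known heat flux is imposed and the steady-state temperature gradient is fitted, with the conductivity following from Fourier's law. The flux and the mock temperature profile are assumptions, not the paper's data.

```python
import numpy as np

# Hypothetical RNEMD post-processing: kappa = -J / (dT/dx), where J is the
# imposed heat flux and dT/dx is the fitted steady-state gradient.
J = 5.0e9                          # imposed heat flux (W/m^2), assumed
x = np.linspace(0.0, 50e-9, 25)    # bin centers along the wire axis (m)
T = 320.0 - 4.0e8 * x + np.random.normal(0, 0.5, x.size)  # mock profile (K)

slope, _ = np.polyfit(x, T, 1)     # dT/dx from a linear fit
kappa = -J / slope
print(f"thermal conductivity ~ {kappa:.1f} W/(m K)")
```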
Abstract:
The Taguchi method is applied for the first time to optimize the synthesis of graphene films by copper-catalyzed decomposition of ethanol. In order to find the most appropriate experimental conditions for the realization of thin high-grade films, six suitably designed experiments were performed. The influence of temperature (1000–1070 °C), synthesis duration (1–30 min), and hydrogen flow (0–100 sccm) on the number of graphene layers and the defect density in the graphitic lattice was ranked by monitoring the intensity of the 2D- and D-bands relative to the G-band in the Raman spectra. After critical examination and adjustment of the conditions predicted to give optimal results, a continuous film consisting of 2–4 nearly defect-free graphene layers was obtained.
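A sketch of the Taguchi-style ranking implied here, using a "smaller-is-better" signal-to-noise ratio on the Raman D/G intensity ratio as a defect proxy; the six-run design matrix and responses are invented.

```python
import numpy as np

# Hypothetical 6-run design: columns are temperature (C), duration (min),
# H2 flow (sccm); response is the Raman D/G intensity ratio (defect proxy).
runs = [
    (1000,  1,   0), (1000, 15,  50), (1000, 30, 100),
    (1070,  1,  50), (1070, 15, 100), (1070, 30,   0),
]
d_to_g = np.array([0.45, 0.30, 0.25, 0.20, 0.10, 0.35])

# Taguchi "smaller-is-better" signal-to-noise ratio per run.
sn = -10 * np.log10(d_to_g ** 2)

# Mean S/N per level of each factor; higher mean S/N marks the better level,
# and larger spread across levels marks the more influential factor.
for col, name in enumerate(["temperature", "duration", "H2 flow"]):
    levels = sorted({run[col] for run in runs})
    means = {lv: sn[[i for i, r in enumerate(runs) if r[col] == lv]].mean()
             for lv in levels}
    print(name, {lv: round(m, 1) for lv, m in means.items()})
```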
Abstract:
Synthesis of MgC2O4⋅2H2O nanoparticles was carried out by thermal double decomposition of solutions of oxalic acid dihydrate (C2H2O4⋅2H2O) and Mg(OAc)2⋅4H2O employing a CATA-2R microwave reactor. Structural elucidation was carried out by X-ray diffraction (XRD), particle size and shape were studied by transmission electron microscopy (TEM), and the nature of bonding was investigated by optical absorption and near-infrared (NIR) spectral studies. The powder resulting from this method is pure and possesses a distorted rhombic octahedral structure. The synthesized nanorods are 80 nm in diameter and 549 nm in length.
Abstract:
Diagnostics is based on the characterization of mechanical system condition and allows early detection of possible faults. Signal processing is an approach widely used in diagnostics, since it directly characterizes the state of the system. Several types of advanced signal processing techniques have been proposed in recent decades and added to more conventional ones; seldom, however, are these techniques able to handle non-stationary operation. Diagnostics of roller bearings is no exception to this framework. In this paper, a new vibration signal processing tool, able to perform roller bearing diagnostics under any working condition and noise level, is developed on the basis of two data-adaptive techniques, Empirical Mode Decomposition (EMD) and Minimum Entropy Deconvolution (MED), coupled by means of the mathematics related to the Hilbert transform. The effectiveness of the new signal processing tool is proven by means of experimental data measured on a test rig that employs high-power, industrial-size components.
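A minimal sketch of the Hilbert-transform envelope analysis that concludes such a processing chain (the EMD and MED stages are omitted); the simulated fault signal and all its parameters are invented.

```python
import numpy as np
from scipy.signal import hilbert

# Simulate a bearing-like signal: 90 Hz fault impacts exciting a 2 kHz
# structural resonance, buried in noise (all parameters invented).
fs, f_fault, f_res = 20000, 90.0, 2000.0
t = np.arange(0, 1.0, 1 / fs)
impacts = (np.sin(2 * np.pi * f_fault * t) > 0.999).astype(float)
kernel = np.exp(-t[:200] * 400) * np.sin(2 * np.pi * f_res * t[:200])
x = np.convolve(impacts, kernel, mode="same") + 0.2 * np.random.randn(t.size)

# Hilbert envelope: the fault repetition frequency appears as the dominant
# peak of the envelope spectrum.
envelope = np.abs(hilbert(x))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print(f"dominant envelope frequency: {freqs[spectrum.argmax()]:.1f} Hz")
```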
Abstract:
Long-term autonomy in robotics requires perception systems that are resilient to unusual but realistic conditions that will eventually occur during extended missions. For example, unmanned ground vehicles (UGVs) need to be capable of operating safely in adverse and low-visibility conditions, such as at night or in the presence of smoke. The key to a resilient UGV perception system lies in the use of multiple sensor modalities, e.g., operating at different frequencies of the electromagnetic spectrum, to compensate for the limitations of a single sensor type. In this paper, visual and infrared imaging are combined in a Visual-SLAM algorithm to achieve localization. We propose to evaluate the quality of data provided by each sensor modality prior to data combination. This evaluation is used to discard low-quality data, i.e., data most likely to induce large localization errors. In this way, perceptual failures are anticipated and mitigated. An extensive experimental evaluation is conducted on data sets collected with a UGV in a range of environments and adverse conditions, including the presence of smoke (obstructing the visual camera), fire, extreme heat (saturating the infrared camera), low-light conditions (dusk), and at night with sudden variations of artificial light. A total of 240 trajectory estimates are obtained using five different variations of data sources and data combination strategies in the localization method. In particular, the proposed approach for selective data combination is compared to methods using a single sensor type or combining both modalities without preselection. We show that the proposed framework allows for camera-based localization resilient to a large range of low-visibility conditions.
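A sketch of the preselection idea: score each modality's frame and pass only frames above a quality threshold to the localization back end. The scoring heuristics (gradient energy for the visual camera, unsaturated fraction for the infrared camera) are assumptions, not the paper's metrics.

```python
import numpy as np

def visual_quality(img: np.ndarray) -> float:
    """Crude contrast proxy: mean gradient magnitude (assumed metric).
    Smoke or low light flattens the image and drives this toward zero."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.hypot(gx, gy).mean())

def infrared_quality(img: np.ndarray, sat_value: int = 255) -> float:
    """Penalize saturation (e.g., fire, extreme heat): unsaturated fraction."""
    return float((img < sat_value).mean())

def select_frames(visual, infrared, v_thresh=5.0, ir_thresh=0.95):
    """Keep only modalities whose quality scores pass their thresholds;
    the surviving frames would feed the Visual-SLAM front end."""
    selected = []
    if visual_quality(visual) >= v_thresh:
        selected.append("visual")
    if infrared_quality(infrared) >= ir_thresh:
        selected.append("infrared")
    return selected

# Hypothetical frames: a smoky (low-contrast) visual image, clean infrared.
smoky = np.full((120, 160), 128, dtype=np.uint8)
clean_ir = np.random.randint(50, 200, (120, 160), dtype=np.uint8)
print(select_frames(smoky, clean_ir))   # -> ['infrared']
```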
Abstract:
Many applications can benefit from the accurate surface temperature estimates that can be made using a passive thermal-infrared camera. However, the process of radiometric calibration which enables this can be both expensive and time consuming. An ad hoc approach for performing radiometric calibration is proposed which does not require specialized equipment and can be completed in a fraction of the time of the conventional method. The proposed approach utilizes the mechanical properties of the camera to estimate scene temperatures automatically, and uses these target temperatures to model the effect of sensor temperature on the digital output. A comparison with a conventional approach using a blackbody radiation source shows that the accuracy of the method is sufficient for many tasks requiring temperature estimation. Furthermore, a novel visualization method is proposed for displaying the radiometrically calibrated images to human operators. The representation employs an intuitive coloring scheme and allows the viewer to perceive a large variety of temperatures accurately.
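A sketch of the kind of calibration model described: digital counts regressed on scene temperature and sensor temperature, then inverted to estimate scene temperature from a new reading. The linear form and all data are assumptions.

```python
import numpy as np

# Hypothetical calibration data: digital output D as a function of scene
# temperature T_scene and sensor temperature T_cam (linear model assumed).
rng = np.random.default_rng(0)
T_scene = rng.uniform(10, 60, 200)        # deg C
T_cam = rng.uniform(15, 45, 200)          # deg C
D = 80.0 * T_scene - 35.0 * T_cam + 5000 + rng.normal(0, 20, 200)

# Least-squares fit of D = a*T_scene + b*T_cam + c.
A = np.column_stack([T_scene, T_cam, np.ones_like(D)])
(a, b, c), *_ = np.linalg.lstsq(A, D, rcond=None)

# Invert the model to estimate scene temperature from a new reading,
# compensating for the current sensor temperature.
D_new, T_cam_new = 6200.0, 30.0
T_est = (D_new - b * T_cam_new - c) / a
print(f"estimated scene temperature: {T_est:.1f} C")
```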
Abstract:
Exploring thermal transport in graphene-polymer nanocomposites is significant for applications requiring better thermal properties. Interfacial thermal conductance between graphene and the polymer matrix plays a critical role in improving the thermal conductivity of graphene-polymer nanocomposites. Unfortunately, it is still challenging to understand the interfacial thermal transport between graphene nanofillers and the polymer matrix at small material length scales. To this end, using non-equilibrium molecular dynamics simulations, we investigate the interfacial thermal conductance of graphene-polyethylene (PE) nanocomposites. The influence of functionalization with hydrocarbon chains on the interfacial thermal conductance was studied, taking into account the effects of model size and the thermal conductivity of graphene. An analytical model is also used to calculate the thermal conductivity of the nanocomposite. These results should contribute to the development of new graphene-polymer nanocomposites with tailored thermal properties.
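A sketch of how interfacial (Kapitza) conductance is commonly extracted from NEMD output: with a steady heat flux crossing the interface, the conductance is the flux divided by the temperature jump. All numbers are hypothetical.

```python
# Interfacial conductance from NEMD: G = J / dT, where J is the steady heat
# flux across the graphene-polymer interface and dT is the temperature jump
# between the two sides (values below are invented).
J = 2.0e8            # steady heat flux across the interface (W/m^2), assumed
T_graphene = 315.0   # mean temperature on the graphene side (K)
T_polymer = 305.0    # mean temperature on the polymer side (K)

G = J / (T_graphene - T_polymer)
print(f"interfacial conductance ~ {G:.2e} W/(m^2 K)")
```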
Abstract:
The purpose of this paper is to describe a new decomposition construction for perfect secret sharing schemes with graph access structures. The previous decomposition construction proposed by Stinson is a recursive method that uses small secret sharing schemes as building blocks in the construction of larger schemes. When the Stinson method is applied to graph access structures, the number of such "small" schemes is typically exponential in the number of participants, resulting in an exponential algorithm. Our method has the same flavor as the Stinson decomposition construction; however, the linear programming problem involved in the construction is formulated in such a way that the number of "small" schemes is polynomial in the number of participants, which in turn gives rise to a polynomial-time construction. We also show that if we apply the Stinson construction to the "small" schemes arising from our new construction, both have the same information rate.
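The flavor of such a linear program can be sketched with a simplified fractional star-cover formulation (one ideal "small" scheme per vertex, hence polynomially many variables); this is an illustration in the spirit of the construction, not the paper's exact LP.

```python
import numpy as np
from scipy.optimize import linprog

# Path graph P4 on vertices 0-1-2-3 (the access structure is its edge set).
vertices = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3)]

# One "small" ideal scheme per vertex: the star of edges incident to it.
# Variables: x_v = weight of the star at v, plus t = maximum vertex load.
# Every edge (u, v) must be covered with total weight >= 1: x_u + x_v >= 1.
# Vertex w lies in star v iff w == v or w is adjacent to v; its load <= t.
n = len(vertices)
adj = {v: {v} for v in vertices}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

c = np.zeros(n + 1)
c[-1] = 1.0                                   # minimize t
A_ub, b_ub = [], []
for u, v in edges:                            # -(x_u + x_v) <= -1
    row = np.zeros(n + 1)
    row[u] = row[v] = -1.0
    A_ub.append(row)
    b_ub.append(-1.0)
for w in vertices:                            # load(w) - t <= 0
    row = np.zeros(n + 1)
    for v in vertices:
        if w in adj[v]:
            row[v] = 1.0
    row[-1] = -1.0
    A_ub.append(row)
    b_ub.append(0.0)

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, None)] * (n + 1))
print(f"max load t = {res.x[-1]:.3f}, information rate = {1 / res.x[-1]:.3f}")
# For P4 this yields t = 1.5, i.e., rate 2/3, the known optimum for the path.
```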
Abstract:
Real-world business process models may consist of hundreds of elements and have a sophisticated structure. Although there are tasks where such models are valuable and appreciated, in general complexity has a negative influence on model comprehension and analysis. Thus, means for managing the complexity of process models are needed. One approach is abstraction of business process models: the creation of a process model which preserves the main features of the initial elaborate process model but leaves out insignificant details. In this paper we study the structural aspects of process model abstraction and introduce an abstraction approach based on process structure trees (PSTs). The developed approach assures that the abstracted process model preserves the ordering constraints of the initial model. It surpasses pattern-based process model abstraction approaches by handling graph-structured process models of arbitrary structure. We also provide an evaluation of the proposed approach.
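A minimal sketch of PST-based abstraction, assuming invented node types and significance weights: a whole subtree whose aggregate significance falls below a threshold is collapsed into one aggregated task, which is why the ordering constraints of the remaining model are preserved.

```python
from dataclasses import dataclass, field
from typing import List

# A toy process structure tree: inner nodes are "sequence" or "parallel"
# blocks, leaves are tasks with a significance weight (all names assumed).
@dataclass
class Node:
    kind: str                     # "task", "sequence", or "parallel"
    label: str
    weight: float = 0.0
    children: List["Node"] = field(default_factory=list)

def total_weight(node: Node) -> float:
    if node.kind == "task":
        return node.weight
    return sum(total_weight(c) for c in node.children)

def abstract(node: Node, threshold: float) -> Node:
    """Collapse any subtree whose total significance is below the threshold
    into a single aggregated task; replacing a whole block at once keeps the
    ordering constraints of the surrounding model intact."""
    if node.kind != "task" and total_weight(node) < threshold:
        return Node("task", f"[{node.label}]", total_weight(node))
    node.children = [abstract(c, threshold) for c in node.children]
    return node

model = Node("sequence", "handle order", children=[
    Node("task", "receive order", 5.0),
    Node("parallel", "checks", children=[
        Node("task", "check stock", 1.0),
        Node("task", "check credit", 1.5),
    ]),
    Node("task", "ship goods", 4.0),
])
print([c.label for c in abstract(model, 3.0).children])
# -> ['receive order', '[checks]', 'ship goods']
```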