Abstract:
This study presents the design and construction of a tri-generation system (thermal efficiency 63%) powered by neat non-edible plant oils (jatropha, pongamia and jojoba oil, or standard diesel fuel), together with studies of plant performance and economics. The proposed plant consumes 3 l/h of fuel and produces 40 kg/h of ice by means of an adsorption refrigerator driven by the engine's waste jacket-water heat. Potential savings in greenhouse gas (GHG) emissions of the tri-generation system in comparison to cogeneration (or single generation) are also discussed.
Abstract:
A novel approach to normal ECG recognition based on scale-space signal representation is proposed. The approach utilizes the curvature scale-space (CSS) representation, previously used to match the shapes of visual objects, and a dynamic programming algorithm for matching the CSS representations of ECG signals. The extraction and matching processes are fast, and experimental results show that the approach is quite robust for preliminary normal ECG recognition.
Abstract:
A novel approach to automatic ECG analysis based on scale-space signal representation is proposed. The approach uses the curvature scale-space (CSS) representation to locate the limits and peaks of the main ECG waveforms, and may be used either to correct the results of other ECG analysis techniques or independently. Moreover, dynamic matching of ECG CSS representations provides robust preliminary recognition of ECG abnormalities, as confirmed by experimental results.
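The dynamic-programming matching step in the two abstracts above can be sketched with classic dynamic time warping over 1-D feature sequences. This is a simplified stand-in: the actual CSS feature extraction is not specified in the abstracts, and the sequences below are invented for illustration.

```python
def dtw_distance(a, b):
    """Dynamic-programming (DTW) distance between two 1-D feature
    sequences -- a simplified stand-in for matching curvature
    scale-space (CSS) representations of ECG signals."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = minimal accumulated cost of aligning a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # skip in a
                                 D[i][j - 1],      # skip in b
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Identical sequences align perfectly; a distorted one scores worse.
template = [0.1, 0.9, 0.2, 0.5, 0.1]
same = dtw_distance(template, template)
shifted = dtw_distance(template, [0.1, 0.2, 0.9, 0.2, 0.5, 0.1])
```

A small distance would then count as a "normal" match; a large one would flag a candidate abnormality for further analysis.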
Abstract:
With the exponentially increasing demand for GIS data visualization systems, in areas such as urban planning, environment and climate-change monitoring, weather simulation and hydrographic gauging, research on and applications of geospatial vector and raster data visualization have become prevalent. However, current web GIS techniques are only suitable for static vector and raster data without dynamic overlay layers. While it is desirable to enable visual exploration of large-scale dynamic vector and raster geospatial data in a web environment, improving the performance between backend datasets and the vector and raster applications remains a challenging technical issue. This dissertation addresses two open problems: how to provide a large-scale dynamic vector and raster data visualization service with dynamic overlay layers, accessible from various client devices through a standard web browser; and how to make that dynamic service as fast as a static one. To accomplish this, a large-scale dynamic vector and raster data visualization geographic information system based on parallel map tiling, together with a comprehensive performance-improvement solution, is proposed, designed and implemented. The components include: quadtree-based indexing and parallel map tiling; the Legend String; vector data visualization with dynamic layer overlaying; vector data time-series visualization; an algorithm for vector data rendering; an algorithm for raster data re-projection; an algorithm for the elimination of superfluous levels of detail; an algorithm for vector data gridding and re-grouping; and server-cluster-side vector and raster data caching.
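The quadtree-based indexing mentioned above can be illustrated with the common web-map tiling convention, in which a longitude/latitude pair maps to a tile address at each zoom level and the tile address interleaves into a quadtree key. This is an assumption for illustration; the dissertation's exact indexing scheme is not given in the abstract.

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Web-Mercator tile (x, y) at a zoom level -- a common quadtree
    index for map tiling (assumed here for illustration)."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r)) / math.pi) / 2.0 * n)
    return x, y

def quadkey(x, y, zoom):
    """Interleave the tile x/y bits into a quadtree key string; each
    digit selects one of the four children at that level."""
    key = []
    for z in range(zoom, 0, -1):
        digit = 0
        mask = 1 << (z - 1)
        if x & mask:
            digit += 1
        if y & mask:
            digit += 2
        key.append(str(digit))
    return "".join(key)

tile = lonlat_to_tile(13.4, 52.5, 10)   # a point near Berlin
key = quadkey(tile[0], tile[1], 10)
```

Because each key prefix identifies a parent tile, keys like this let tiles be generated and cached independently, which is what makes the map-tiling step embarrassingly parallel.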
Abstract:
The Lena River Delta, situated in Northern Siberia (72.0 - 73.8° N, 122.0 - 129.5° E), is the largest Arctic delta and covers 29,000 km². Since natural deltas are characterised by complex geomorphological patterns and various types of ecosystems, high-spatial-resolution information on the distribution and extent of the delta environments is necessary for a spatial assessment and accurate quantification of the biogeochemical processes that drive greenhouse gas emissions from tundra soils. In this study, the first land cover classification for the entire Lena Delta, based on Landsat 7 Enhanced Thematic Mapper (ETM+) images, was conducted and used for the quantification of methane emissions from the delta ecosystems at the regional scale. The applied supervised minimum-distance classification was very effective with the few ancillary data that were available for training-site selection. Nine land cover classes of aquatic and terrestrial ecosystems in the wetland-dominated (72%) Lena Delta could be defined by this classification approach. The mean daily methane emission of the entire Lena Delta was calculated as 10.35 mg CH4/m²/d. Taking our multi-scale approach into account, we find that the methane source strength of certain tundra wetland types is lower than previously calculated on coarser scales.
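The regional upscaling behind a delta-wide mean flux is an area-weighted average over the land cover classes. The class names, areas and per-class fluxes below are invented for illustration (only the 29,000 km² total mirrors the abstract), so the resulting number is not the study's 10.35 mg CH4/m²/d.

```python
# Hypothetical land cover classes with per-class methane fluxes
# (illustrative values, not the study's measurements).
classes = {
    "wet tundra":   {"area_km2": 12000, "flux_mg_m2_d": 15.0},
    "dry tundra":   {"area_km2": 9000,  "flux_mg_m2_d": 4.0},
    "water bodies": {"area_km2": 8000,  "flux_mg_m2_d": 9.0},
}

total_area = sum(c["area_km2"] for c in classes.values())
# Area-weighted mean daily flux over the whole delta, mg CH4/m^2/d
mean_flux = sum(c["area_km2"] * c["flux_mg_m2_d"]
                for c in classes.values()) / total_area
```

Scaling each class flux by the classified area is exactly where the Landsat-based land cover map enters the methane budget: a misclassified class area shifts the delta-wide mean in proportion to that class's flux.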
Abstract:
While molecular and cellular processes, such as Brownian motion, chemical reaction networks and gene regulatory networks, are often modeled as stochastic processes, there have been few attempts to program a molecular-scale process to physically implement a stochastic process. DNA has been used as a substrate for programming molecular interactions, but its applications have been restricted to deterministic functions, and unfavorable properties such as slow processing, thermal annealing, aqueous solvents and difficult readout limit them to proof-of-concept purposes. To date, it has remained unknown whether there exists a molecular process that can be programmed to implement stochastic processes for practical applications.
In this dissertation, a fully specified Resonance Energy Transfer (RET) network between chromophores is accurately fabricated via DNA self-assembly, and the exciton dynamics in the RET network physically implement a stochastic process, specifically a continuous-time Markov chain (CTMC), which has a direct mapping to the physical geometry of the chromophore network. Excited by a light source, a RET network generates random samples in the temporal domain in the form of fluorescence photons which can be detected by a photon detector. The intrinsic sampling distribution of a RET network is derived as a phase-type distribution configured by its CTMC model. The conclusion is that the exciton dynamics in a RET network implement a general and important class of stochastic processes that can be directly and accurately programmed and used for practical applications of photonics and optoelectronics. Different approaches to using RET networks exist with vast potential applications. As an entropy source that can directly generate samples from virtually arbitrary distributions, RET networks can benefit applications that rely on generating random samples such as 1) fluorescent taggants and 2) stochastic computing.
By using RET networks between chromophores to implement fluorescent taggants with temporally coded signatures, the taggant design is not constrained by resolvable dyes and has a significantly larger coding capacity than spectrally or lifetime coded fluorescent taggants. Meanwhile, the taggant detection process becomes highly efficient, and the Maximum Likelihood Estimation (MLE) based taggant identification guarantees high accuracy even with only a few hundred detected photons.
Meanwhile, RET-based sampling units (RSUs) can be constructed to accelerate probabilistic algorithms for a wide range of applications in machine learning and data analytics. Because probabilistic algorithms often rely on iteratively sampling from parameterized distributions, they can be inefficient in practice on the deterministic hardware of traditional computers, especially for high-dimensional and complex problems. As an efficient universal sampling unit, the proposed RSU can be integrated into a processor or GPU as a specialized functional unit, or organized as a discrete accelerator, to bring substantial speedups and power savings.
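The mapping from network geometry to stochastic process described above can be sketched in miniature: CTMC states stand for chromophores, transition rates stand for RET rates, and the time to reach an absorbing "emission" state is one phase-type sample (the photon arrival time). The network topology, rates and state names below are invented for illustration, not taken from the dissertation.

```python
import random

def sample_absorption_time(Q, start, absorbing, rng):
    """Sample the time to absorption of a CTMC (a phase-type draw),
    standing in for one exciton's hop sequence ending in emission.
    Q: dict state -> list of (next_state, rate)."""
    t, s = 0.0, start
    while s not in absorbing:
        transitions = Q[s]
        total = sum(rate for _, rate in transitions)
        t += rng.expovariate(total)      # exponential holding time
        u = rng.uniform(0.0, total)      # pick next state by rate
        acc = 0.0
        for nxt, rate in transitions:
            acc += rate
            if u <= acc:
                s = nxt
                break
    return t

# Toy 3-state network: donor -> relay -> emit (absorbing), with a
# back-transfer path relay -> donor. All rates are hypothetical.
Q = {"donor": [("relay", 2.0)],
     "relay": [("donor", 0.5), ("emit", 1.0)]}

times = [sample_absorption_time(Q, "donor", {"emit"}, random.Random(i))
         for i in range(2000)]
mean_t = sum(times) / len(times)
```

For this toy chain the expected absorption time works out analytically to 1.75 time units, so the empirical mean of many draws should land close to that; a real RET network would realize such draws physically, as fluorescence photon timestamps.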
Abstract:
A low-threshold nanolaser with all three dimensions at the subwavelength scale is proposed and investigated. The nanolaser is based on an asymmetric hybrid plasmonic Fabry-Perot (F-P) cavity with Ag-coated end facets. Lasing characteristics are calculated using the finite element method at a wavelength of 1550 nm. The results show that, owing to the low modal loss and large modal confinement factor of the asymmetric plasmonic cavity structure, in conjunction with the high reflectivity of the Ag reflectors, a minimum threshold gain of 240 cm−1 is predicted. Furthermore, a Purcell factor as large as 2518 is obtained with optimized structure parameters, enhancing the rates of spontaneous and stimulated emission.
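How modal loss, confinement and mirror reflectivity combine into a threshold gain can be seen from the generic textbook Fabry-Perot relation Γ·g_th = α_i + ln(1/(R1·R2))/(2L). All parameter values below are made up for illustration; they are not the paper's data, and the resulting number deliberately differs from its reported 240 cm−1.

```python
import math

# Generic F-P threshold-gain relation with hypothetical parameters:
#   gamma * g_th = alpha_i + (1 / (2 L)) * ln(1 / (R1 * R2))
gamma = 0.6        # modal confinement factor (hypothetical)
alpha_i = 100.0    # modal (propagation) loss, cm^-1 (hypothetical)
L = 1e-4           # cavity length: 1 um expressed in cm (hypothetical)
R1 = R2 = 0.9      # Ag end-facet reflectivities (hypothetical)

g_th = (alpha_i + math.log(1.0 / (R1 * R2)) / (2.0 * L)) / gamma  # cm^-1
```

Even with these invented numbers the structure of the formula shows the paper's levers: raising Γ divides the whole threshold down, while better Ag reflectors shrink the mirror-loss term ln(1/(R1·R2))/(2L) that dominates at micron-scale cavity lengths.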
Abstract:
Understanding and predicting patterns of distribution and abundance of marine resources is important for conservation and management purposes in small-scale artisanal fisheries and industrial fisheries worldwide. The goose barnacle (Pollicipes pollicipes) is an important shellfish resource and its distribution is closely related to wave exposure at different spatial scales. We modelled the abundance (percent coverage) of P. pollicipes as a function of a simple wave exposure index based on fetch estimates from digitized coastlines at different spatial scales. The model accounted for 47.5% of the explained deviance and indicated that barnacle abundance increases non-linearly with wave exposure at both the smallest (metres) and largest (kilometres) spatial scales considered in this study. Distribution maps were predicted for the study region in SW Portugal. Our study suggests that the relationship between fetch-based exposure indices and P. pollicipes percent cover may be used as a simple tool for providing stakeholders with information on barnacle distribution patterns. This information may improve assessment of harvesting grounds and the dimension of exploitable areas, aiding management plans and supporting decision making on conservation, harvesting pressure and surveillance strategies for this highly appreciated and socio-economically important marine resource.
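A fetch-based exposure index of the kind used above can be sketched as the mean open-water distance over equally spaced compass sectors, capped at a maximum fetch. The definition, cap value and sector fetches below are illustrative assumptions; the paper's exact index may be computed differently.

```python
def fetch_exposure_index(fetches_km, max_fetch_km=300.0):
    """Mean open-water distance (km) over compass sectors around a
    shore point, each sector capped at max_fetch_km. An illustrative
    fetch-based wave-exposure index, not the paper's exact formula."""
    capped = [min(f, max_fetch_km) for f in fetches_km]
    return sum(capped) / len(capped)

# Eight compass sectors around two hypothetical shore points:
# distance to the nearest coastline in each sector, in km.
sheltered = fetch_exposure_index([1, 2, 0.5, 3, 1, 2, 450, 0.5])
exposed = fetch_exposure_index([400, 500, 350, 300, 250, 450, 380, 320])
```

Computing such an index at several sector resolutions (fine angular steps for metre-scale shelter, coarse ones for kilometre-scale exposure) is one simple way to obtain the multi-scale predictors the abundance model relies on.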
Abstract:
We apply the collective consumption model of Browning, Chiappori and Lewbel (2006) to analyse economic well-being and poverty among the elderly. The model focuses on individual preferences, a consumption technology that captures the economies of scale of living in a couple, and a sharing rule that governs the intra-household allocation of resources. The model is applied to a time series of Dutch consumption expenditure surveys. Our empirical results indicate substantial economies of scale and a wife's share that is increasing in total expenditures. We further calculated poverty rates by means of the collective consumption model. Collective poverty rates of widows and widowers turn out to be slightly lower than traditional ones based on a standard equivalence scale. Poverty among women (men) in elderly couples, however, seems to be heavily underestimated (overestimated) by the traditional approach. Finally, we analysed the impact of becoming a widow(er). Based on cross-sectional evidence, we find that the drop (increase) in material well-being following the husband's death is substantial for women in high (low) expenditure couples. For men, the picture is reversed.
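As a purely illustrative sketch of how a sharing rule and economies of scale interact, the toy calculation below converts a couple's expenditure into single-person-equivalent budgets. Both the numbers and the simplified scaling rule are invented; they are not the paper's estimated consumption technology or sharing rule.

```python
# Toy collective-consumption split (all values hypothetical).
y = 30000.0   # couple's total annual expenditure
eta = 0.55    # sharing rule: wife's share of household resources
s = 1.4       # consumption technology: a couple needs s times one
              # single budget to reach the same living standard
              # (s < 2 means living together creates scale economies)

# Single-person-equivalent budgets: each partner's share, scaled up
# by the per-person advantage of joint consumption (2 / s).
wife_equiv = eta * y * (2.0 / s)
husband_equiv = (1.0 - eta) * y * (2.0 / s)
```

Comparing each single-equivalent budget to an individual poverty line, rather than deflating y by one household-level equivalence scale, is what lets the collective approach find poor women inside non-poor couples.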
Abstract:
Strong convective events can produce extreme precipitation, hail, lightning or gusts, potentially inducing severe socio-economic impacts. These events have a relatively small spatial extent and, in most cases, a short lifetime. In this study, a model is developed for estimating convective extreme events based on large-scale conditions. It is shown that strong convective events can be characterized by a Weibull distribution of radar-based rainfall with a low shape and a high scale parameter value. A radius of 90 km around a station reporting a convective situation proved suitable. A methodology is developed to estimate the Weibull parameters, and thus the occurrence probability of convective events, from large-scale atmospheric instability and enhanced near-surface humidity, which are usually found on a larger scale than the convective event itself. Here, the probability of occurrence of extreme convective events is estimated from the KO index, indicating stability, and the relative humidity at 1000 hPa. Both variables are computed from the ERA-Interim reanalysis. In a first version of the methodology, these two variables are applied to estimate the spatial rainfall distribution and the occurrence of a convective event. The developed method shows significant skill in estimating the occurrence of convective events as observed at synoptic stations, by lightning measurements and in severe weather reports. In order to take frontal influences into account, a scheme for the detection of atmospheric fronts is implemented. While generally higher instability is found in the vicinity of fronts, the skill of this approach is largely unchanged. Additional improvements were achieved by a bias correction and the use of ERA-Interim precipitation. The resulting estimation method is applied to the ERA-Interim period (1979-2014) to establish a ranking of estimated convective extreme events.
Two strong estimated events that reveal a frontal influence are analysed in detail. As a second application, the method is applied to GCM-based decadal predictions in the period 1979-2014, which were initialized every year. It is shown that decadal predictive skill for convective event frequencies over Germany is found for the first 3-4 years after the initialization.
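The "low shape, high scale" fingerprint above rests on fitting a Weibull distribution to rainfall samples. A stdlib-only maximum-likelihood fit can be sketched as below; the synthetic "radar rainfall" sample and its parameters are invented for illustration, not taken from the study.

```python
import math
import random

def weibull_mle(x, k_lo=0.05, k_hi=20.0, iters=80):
    """Maximum-likelihood Weibull (shape k, scale lam) fit via
    bisection on the standard shape equation
      sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0,
    which is increasing in k and has a unique root."""
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / len(logs)

    def f(k):
        xk = [v ** k for v in x]
        return sum(w * l for w, l in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log

    lo, hi = k_lo, k_hi
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, lam

# Synthetic heavy-tailed "rainfall" sample: shape 0.7, scale 8
# (random.weibullvariate takes scale first, then shape).
rng = random.Random(42)
sample = [rng.weibullvariate(8.0, 0.7) for _ in range(5000)]
k_hat, lam_hat = weibull_mle(sample)
```

A fit recovering a shape well below 1 with a large scale would, in the spirit of the study, flag the rainfall field within the 90 km radius as convective-like.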
Abstract:
Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated operating network and single-base RTK systems, together with multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous users' requests (reverse RTK)
These challenges lead to two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transferring capability. This research explores new approaches to addressing these future NRTK challenges and requirements using the Grid Computing facility, in particular for large data-processing burdens and complex computation algorithms.
A Grid Computing based NRTK framework is proposed in this research. It is a layered framework consisting of: 1) a client layer in the form of a Grid portal; 2) a service layer; 3) an execution layer. The user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. The open-source Networked Transport of RTCM via Internet Protocol (Ntrip) software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new Grid Computing based NRTK framework, while some aspects of the system's performance are yet to be improved in future work.
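The scheduling idea (a request fanned out to nodes, one per reference station, with results gathered at the centre) can be mimicked in miniature with a worker pool standing in for Grid nodes. The station names and the per-station computation below are placeholders, not the thesis's actual RTK processing or Grid middleware.

```python
from concurrent.futures import ThreadPoolExecutor

def rtk_job(station):
    """Placeholder for a per-station RTK computation on one node.
    Here it just derives a deterministic dummy value from the name."""
    correction = sum(ord(c) for c in station) % 100
    return station, correction

# Hypothetical CORS station identifiers in the network.
stations = ["CORS-A", "CORS-B", "CORS-C", "CORS-D", "CORS-E"]

# Fan the jobs out to a pool of workers (stand-ins for Grid nodes)
# and gather results back at the "network centre".
with ThreadPoolExecutor(max_workers=5) as pool:
    results = dict(pool.map(rtk_job, stations))
```

The same fan-out/gather shape is what makes the reverse-RTK workload scalable: adding nodes raises the number of stations (or simultaneous user requests) processed per scheduling cycle.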
Abstract:
This paper takes stock of current changes affecting journalism and, as a case study, brings up to date the record of progress made with an online publishing enterprise, EUAustralia Online, first reported on in 2007. It perceives the development of online news publishing as occurring in two sectors: media corporations moving to occupy the online publishing field through complex business stratagems and product-making, and small publications enjoying low production costs and the ability to strike up relationships with numerous users, even on a mass scale. Recent developments in both major publishing and niche publishing are appraised in a literature review, considering both broad-scale industry trends and pressure from fresh advances in information and communication technology to produce ever more sophisticated media artefacts, such as multi-platform news coverage. The paper also recounts the pattern of work on EUAustralia Online, showing how such publications, the ubiquitous ‘blogs’ or newsletters, may find a place in a prospective online order in which large and small operations might co-exist.
Abstract:
In a power network, when a propagating energy wave caused by a disturbance hits a weak link, a reflection appears and some of the energy is transferred across the link. In this work, an analytical, descriptive methodology is proposed to study the dynamic stability of a large-scale power system. For this purpose, the electrical indices (angle, or voltage/frequency) measured at different points in the network following a fault are used, and the behaviour of the propagated waves through the lines, nodes and buses is studied. This work presents a new tool for power system stability analysis based on a descriptive study of electrical measurements. The proposed methodology is also useful for detecting contingency conditions and for synthesising an effective emergency control scheme.
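The split of an incident wave into a reflected and a transmitted part at a discontinuity can be illustrated with the textbook impedance-mismatch coefficients from transmission-line theory. This is a generic analogue of the "weak link" picture above, with hypothetical impedance values; it is not the paper's actual network model.

```python
def reflection_transmission(z1, z2):
    """Amplitude reflection (r) and transmission (t) coefficients for
    a wave travelling in a medium of impedance z1 hitting a boundary
    with impedance z2 (textbook relations, with 1 + r = t)."""
    r = (z2 - z1) / (z2 + z1)
    t = 2.0 * z2 / (z2 + z1)
    return r, t

matched = reflection_transmission(50.0, 50.0)   # no discontinuity
weak_link = reflection_transmission(50.0, 5.0)  # strong mismatch
```

A matched link passes the disturbance through untouched, while a strong mismatch reflects most of the wave's amplitude; tracking where measured angle or voltage/frequency waves reflect is how such a methodology would locate weak links.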