198 results for APPROXIMATE SOLUTIONS


Relevance:

20.00%

Publisher:

Abstract:

A new decision-making tool that assists designers in selecting appropriate daylighting solutions for buildings in tropical locations has been previously proposed by the authors. Through an evaluation matrix that prioritizes the parameters that best respond to the needs of tropical climates (e.g. reducing solar gain and protecting from glare), the tool determines the most appropriate devices for specific climate and building inputs. The tool is effective in demonstrating the broad benefits and limitations of the different daylight strategies for buildings in the tropics. However, for thorough analysis and calibration, the tool requires validation. This paper presents a first step in the validation process. RADIANCE simulations were conducted to compare simulated performance with the performance predicted by the tool. To this end, an office building case study in subtropical Brisbane, Australia, and five different daylighting devices, including openings, light guiding systems and light transport systems, were simulated. Illuminance, light uniformity, daylight penetration and glare were assessed for each device. The results indicate the tool can appropriately rank and recommend daylighting strategies based on specific building inputs for tropical and subtropical regions, making it a useful resource for designers.
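The evaluation-matrix mechanism described above can be sketched as a weighted-score ranking: each device gets a score per climate parameter, parameters carry priority weights, and devices are ranked by the weighted sum. All parameter names, weights and scores below are invented for illustration; they are not the authors' actual matrix.

```python
# Hypothetical sketch of an evaluation-matrix ranking. Weights prioritise
# tropical needs (e.g. solar gain reduction, glare protection); scores are
# on an illustrative 0-5 scale.
WEIGHTS = {"solar_gain_reduction": 0.4, "glare_protection": 0.3,
           "daylight_penetration": 0.2, "uniformity": 0.1}

DEVICES = {
    "opening":         {"solar_gain_reduction": 2, "glare_protection": 2,
                        "daylight_penetration": 4, "uniformity": 3},
    "light_guiding":   {"solar_gain_reduction": 4, "glare_protection": 4,
                        "daylight_penetration": 3, "uniformity": 4},
    "light_transport": {"solar_gain_reduction": 5, "glare_protection": 4,
                        "daylight_penetration": 2, "uniformity": 3},
}

def rank_devices(devices, weights):
    """Return (device, weighted score) pairs sorted best-first."""
    scored = {name: sum(weights[p] * s for p, s in scores.items())
              for name, scores in devices.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

for name, score in rank_devices(DEVICES, WEIGHTS):
    print(f"{name}: {score:.2f}")
```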

Abstract:

Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working on large process model repositories, where process standardization is a critical task for increasing the consistency and reducing the complexity of the repository.

Abstract:

Advances in algorithms for approximate sampling from a multivariate target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments, that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way.
If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists. Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than an indirect technique such as muscle strength assessment, which is generally unable to detect progression because of the body’s natural attempts at compensation.
Part III of this thesis builds upon a previous Bayesian technique based on a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by marginalising over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
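The likelihood-free ABC idea summarised in Part I can be illustrated with a minimal rejection sampler: draw parameters from the prior, simulate data from the model, and keep draws whose simulated summary statistic is close to the observed one. The toy model (inferring a normal mean) and the tolerance below are assumptions for illustration; the thesis develops far more efficient SMC variants.

```python
# Minimal ABC rejection sampler: no likelihood evaluation, only repeated
# simulation from the model plus comparison of summary statistics.
import random
import statistics

def simulate(theta, n=50, rng=random):
    """Toy model: n i.i.d. draws from Normal(theta, 1)."""
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def abc_rejection(observed, prior_sample, n_draws=5000, tol=0.1):
    """Keep prior draws whose simulated summary statistic (sample mean)
    is within `tol` of the observed summary statistic."""
    s_obs = statistics.mean(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        s_sim = statistics.mean(simulate(theta))
        if abs(s_sim - s_obs) <= tol:
            accepted.append(theta)
    return accepted  # an approximate posterior sample

random.seed(1)
observed = simulate(2.0)               # "observed" data, true mean 2
prior = lambda: random.uniform(-5, 5)  # flat prior on theta
post = abc_rejection(observed, prior)
print(len(post), statistics.mean(post))
```

In practice the acceptance rate of plain rejection ABC collapses as the tolerance shrinks, which is exactly the inefficiency the SMC-based algorithms in Part I target.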

Abstract:

This paper reviews recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations. We give a brief survey of the area, concentrating on a number of application areas where approximations to strong solutions are important, with a particular focus on computational biology, and give the necessary analytical tools for understanding some of the important concepts associated with stochastic processes. We present the stochastic Taylor series expansion as the fundamental mechanism for constructing effective numerical methods, give general results that relate local and global order of convergence, and mention the Magnus expansion as a mechanism for designing methods that preserve the underlying structure of the problem. We also present various classes of explicit and implicit methods for strong solutions, based on the underlying structure of the problem. Finally, we discuss implementation issues relating to maintaining the Brownian path, efficient simulation of stochastic integrals, and variable-step-size implementations based on various types of control.
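As a concrete instance of a strong scheme of the kind such reviews cover, the following is a minimal Euler-Maruyama sketch for geometric Brownian motion, dX = mu*X dt + sigma*X dW. The parameter values are illustrative, and this is a generic textbook scheme rather than code from the paper.

```python
# Euler-Maruyama: the simplest strong scheme, obtained by truncating the
# stochastic Taylor expansion after the first Brownian increment term.
import math
import random

def euler_maruyama(x0, mu, sigma, T=1.0, n=1000, rng=random):
    """Approximate one sample path of dX = mu*X dt + sigma*X dW."""
    dt = T / n
    x = x0
    path = [x]
    for _ in range(n):
        dW = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = x + mu * x * dt + sigma * x * dW
        path.append(x)
    return path

random.seed(0)
path = euler_maruyama(1.0, mu=0.05, sigma=0.2)
print(path[-1])
```

Strong convergence here is of order 1/2 in the step size; higher-order schemes add further terms of the stochastic Taylor expansion.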

Abstract:

This chapter considers to what degree the careers of women with young families, both in and out of paid employment, are lived as contingent, intersubjective projects pursued across time and space, in the social condition of growing biographical possibilities and uneven social/ideological change. Their resolutions of competing priorities by engaging in various permutations of home-work and paid work are termed ‘workable solutions’, with an intentional play on the double sense of ‘work’ – firstly as labour, thus being able to perform work, whether paid or not; secondly as in being able to make things work or function in the family unit’s best interests, however defined.

Abstract:

Stochastic differential equations (SDEs) arise from physical systems where the parameters describing the system can only be estimated or are subject to noise. There has been much work done recently on developing numerical methods for solving SDEs. This paper will focus on stability issues and variable stepsize implementation techniques for numerically solving SDEs effectively.
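The stability issue can be illustrated on the stiff linear test SDE dX = lam*X dt + sigma dW: with lam = -50 and step size dt = 0.1, explicit Euler-Maruyama has amplification factor |1 + lam*dt| = 4 per step and blows up, while a drift-implicit Euler step stays bounded. This is a generic illustration of the phenomenon, not code from the paper.

```python
# Compare explicit and drift-implicit Euler on the stiff test SDE
# dX = lam*X dt + sigma dW with lam = -50, dt = 0.1.
import math
import random

def explicit_step(x, lam, sigma, dt, dW):
    # Explicit Euler-Maruyama: amplification factor (1 + lam*dt).
    return x + lam * x * dt + sigma * dW

def implicit_step(x, lam, sigma, dt, dW):
    # Drift-implicit Euler: solve x_new = x + lam*x_new*dt + sigma*dW,
    # giving amplification factor 1/(1 - lam*dt).
    return (x + sigma * dW) / (1.0 - lam * dt)

random.seed(0)
lam, sigma, dt, n = -50.0, 0.5, 0.1, 200
xe = xi = 1.0
for _ in range(n):
    dW = random.gauss(0.0, math.sqrt(dt))
    xe = explicit_step(xe, lam, sigma, dt, dW)
    xi = implicit_step(xi, lam, sigma, dt, dW)

print(abs(xe), abs(xi))  # explicit iterate explodes; implicit stays small
```

A variable-stepsize implementation would instead shrink dt until the explicit scheme is stable, at the cost of many more steps; the implicit scheme is stable for any dt > 0 here.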

Abstract:

The increasing use of computerized systems in our daily lives creates new adversarial opportunities, in which complex mechanisms are exploited, driving the rapid development of new attacks. Behavioral biometrics appears to be one of the most promising responses to these attacks, but because it is a relatively new research area, no specific frameworks for the evaluation and development of behavioral biometrics solutions exist yet. In this paper we present the design of a generic framework and runtime environment that will enable researchers to develop, evaluate and compare their behavioral biometrics solutions through repeatable experiments under the same conditions and with the same data.

Abstract:

An efficient numerical method to compute nonlinear solutions for two-dimensional steady free-surface flow over an arbitrary channel bottom topography is presented. The approach is based on a boundary integral equation technique similar to that of Vanden-Broeck (1996, J. Fluid Mech., 330, 339-347). The typical approach for this problem is to prescribe the shape of the channel bottom topography, with the free surface obtained as part of the solution. Here we take an inverse approach: we prescribe the shape of the free surface a priori and solve for the corresponding bottom topography. We show how this inverse approach is particularly useful when studying topographies that give rise to wave-free solutions, allowing us to easily classify eleven basic flow types. Finally, the inverse approach is also adapted to calculate a distribution of pressure on the free surface, given the free-surface shape itself.

Abstract:

This investigation has shown that transforming the free caustic in red mud (RM) to Bayer hydrotalcite (during the seawater neutralization (SWN) process) enables a more controlled release mechanism for the neutralization of acid sulfate soils. The formation of hydrotalcite has been confirmed by X-ray diffraction (XRD) and differential thermogravimetric analysis (DTG), while the dissolution of hydrotalcite and sodalite has been observed through XRD, DTG, pH plots, and ICP-OES. Coupling all techniques enabled three neutralization mechanisms to be determined: (1) free alkali, (2) hydrotalcite dissolution, and (3) sodalite dissolution. The mechanisms are determined on the basis of ICP-OES and kinetic information. When the mass of RM or SWN-RM is greater than 0.08 g/50 mL, the pH of the solution increases to a value suitable for plant life while aluminum leaching is kept to a minimum. To obtain a neutralization pH greater than 6 within 10 min, the required ratio of bauxite residue (g) to solution volume at a known iron sulfate (Fe2(SO4)3) concentration is 0.04 g : 50 mL : 0.1 g/L Fe2(SO4)3.
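The quoted ratio (0.04 g residue : 50 mL : 0.1 g/L Fe2(SO4)3) can be turned into a simple helper that scales the reference recipe to other volumes and concentrations. Note that linear scaling with both volume and Fe2(SO4)3 concentration is an assumption made here for illustration, not a claim from the paper.

```python
# Reference recipe from the abstract: 0.04 g bauxite residue per 50 mL of
# solution at 0.1 g/L Fe2(SO4)3 reaches pH > 6 within 10 min.
BASE_MASS_G = 0.04     # g residue in the reference recipe
BASE_VOLUME_ML = 50.0  # mL solution in the reference recipe
BASE_FE_CONC = 0.1     # g/L Fe2(SO4)3 in the reference recipe

def residue_mass_needed(volume_ml, fe_conc_g_per_l):
    """Residue mass (g) by linear scaling of the reference ratio
    (assumed linear in both volume and Fe2(SO4)3 concentration)."""
    return (BASE_MASS_G
            * (volume_ml / BASE_VOLUME_ML)
            * (fe_conc_g_per_l / BASE_FE_CONC))

print(residue_mass_needed(50.0, 0.1))    # reference case
print(residue_mass_needed(1000.0, 0.2))  # 1 L at double the Fe concentration
```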

Abstract:

Pretreatment of sugarcane bagasse with acidified aqueous glycerol solution was evaluated at both laboratory and pilot scales. Laboratory scale pretreatment (4.00 g dry mass in 40.00 g liquid) with glycerol solutions containing ≤ 20 wt% water and 1.2 wt% HCl at 130 °C for 60 min resulted in biomass having glucan digestibilities of ≥ 88%. Comparable glucan enzymatic digestibility of 90% was achieved with bagasse pretreated at pilot scale (10 kg dry mass in 60 kg liquid) using a glycerol solution containing 0.4 wt% HCl and 17 wt% water at 130 °C for 15 min. We attribute more efficient pretreatment at pilot scale (despite shorter reaction time and reduced acid content) to improved mixing and heat transfer in a horizontal reactor. Pretreatment of sugarcane bagasse with acid-catalysed glycerol solutions likely produces glycerol-glycosides, which together with hydrolysed lignin are potential substrates for the production of biopolymers.

Abstract:

Fouling of industrial surfaces by silica and calcium oxalate can be detrimental to a number of process streams. Solution chemistry plays a large role in the rate and type of scale formed on industrial surfaces. This study examines the kinetics and thermodynamics of SiO2 and calcium oxalate composite formation in solutions containing Mg2+ ions, trans-aconitic acid and sucrose, chosen to mimic factory sugar cane juices. The induction time (ti) of silicic acid polymerization is found to depend on the sucrose concentration and the SiO2 supersaturation ratio (SS). Generalized kinetic and solubility models are developed for SiO2 and calcium oxalate in binary systems using response surface methodology. The role of sucrose, Mg, trans-aconitic acid, a mixture of Mg and trans-aconitic acid, the SiO2 SS ratio and Ca in the formation of composites is explained using the solution properties of these species, including their ability to form complexes.

Abstract:

This work experimentally examines the performance benefits of a regional CORS network for GPS orbit and clock solutions supporting real-time Precise Point Positioning (PPP). The regionally enhanced GPS precise orbit solutions are derived from a globally evenly distributed CORS network augmented with a densely distributed network in Australia and New Zealand. A series of computational schemes for different network configurations are adopted in the GAMIT-GLOBK and PANDA data processing. The precise GPS orbit results show that the regionally enhanced solutions achieve overall orbit improvements with respect to the solutions derived from the global network only. Additionally, the orbital differences over GPS satellite arcs that are visible from any of the five Australia-wide CORS stations show a higher percentage of overall improvement than the satellite arcs that are not visible from these stations. The regional GPS clock and Uncalibrated Phase Delay (UPD) products are derived using the PANDA real-time processing module from Australian CORS networks of 35 and 79 stations respectively. Analysis of PANDA kinematic PPP and kinematic PPP-AR solutions shows certain overall improvements in positioning performance from a denser network configuration after solution convergence. However, the clock and UPD enhancement of kinematic PPP solutions is marginal. It is suggested that other factors, such as ionospheric effects and incorrectly fixed ambiguities, may be more dominant and deserve further research attention.

Abstract:

The health impacts of exposure to ambient temperature have been drawing increasing attention from the environmental health research community, government, society, industries, and the public. Case-crossover and time series models are most commonly used to examine the effects of ambient temperature on mortality. However, some key methodological issues remain to be addressed. For example, few studies have used spatiotemporal models to assess the effects of spatial temperatures on mortality. Few studies have used a case-crossover design to examine the delayed (distributed lag) and non-linear relationship between temperature and mortality. Also, little evidence is available on the effects of temperature changes on mortality, or on differences in heat-related mortality over time. This thesis aimed to address the following research questions: 1. How can the case-crossover design and distributed lag non-linear models be combined? 2. Is there any significant difference in effect estimates between time series and spatiotemporal models? 3. How can the effects on mortality of temperature changes between neighbouring days be assessed? 4. Is there any change in temperature effects on mortality over time? To combine the case-crossover design and the distributed lag non-linear model, datasets of deaths, weather conditions (minimum temperature, mean temperature, maximum temperature, and relative humidity), and air pollution were acquired for Tianjin, China, for the years 2005 to 2007. I demonstrated how to combine the case-crossover design with a distributed lag non-linear model. This allows the case-crossover design to estimate the non-linear and delayed effects of temperature whilst controlling for seasonality. There was a consistent U-shaped relationship between temperature and mortality. Cold effects were delayed by 3 days, and persisted for 10 days.
Hot effects were acute, lasted for three days, and were followed by mortality displacement for non-accidental, cardiopulmonary, and cardiovascular deaths. Mean temperature was a better predictor of mortality (based on model fit) than maximum or minimum temperature. It is still unclear whether spatiotemporal models using spatial temperature exposure produce better estimates of mortality risk than time series models that use a single site's temperature or averaged temperature from a network of sites. Daily mortality data were obtained from 163 locations across Brisbane city, Australia, from 2000 to 2004. Ordinary kriging was used to interpolate spatial temperatures across the city based on 19 monitoring sites. A spatiotemporal model was used to examine the impact of spatial temperature on mortality. A time series model was used to assess the effects on mortality of a single site's temperature, and of averaged temperature from 3 monitoring sites. Squared Pearson scaled residuals were used to check model fit. The results of this study show that even though spatiotemporal models gave a better model fit than time series models, the two gave similar effect estimates. Time series analyses using temperature recorded at a single monitoring site, or the average temperature of multiple sites, were as good at estimating the association between temperature and mortality as a spatiotemporal model. A time series Poisson regression model was used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000. Temperature change was calculated as the current day's mean temperature minus the previous day's mean.
In Brisbane, a drop of more than 3 °C in temperature between days was associated with relative risks (RRs) of 1.16 (95% confidence interval (CI): 1.02, 1.31) for non-external mortality (NEM), 1.19 (95% CI: 1.00, 1.41) for NEM in females, and 1.44 (95% CI: 1.10, 1.89) for NEM aged 65–74 years. An increase of more than 3 °C was associated with RRs of 1.35 (95% CI: 1.03, 1.77) for cardiovascular mortality and 1.67 (95% CI: 1.15, 2.43) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with RRs of 1.13 (95% CI: 1.05, 1.22) for total NEM, 1.25 (95% CI: 1.13, 1.39) for cardiovascular mortality, and 1.25 (95% CI: 1.14, 1.39) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. A change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality even after controlling for mean temperature. I examined the variation in the effects of high temperatures on elderly mortality (age ≥ 75 years) by year, city and region for 83 large US cities between 1987 and 2000. High temperature days were defined as two or more consecutive days with temperatures above the 90th percentile for each city during each warm season (May 1 to September 30). The mortality risk for high temperatures was decomposed into a "main effect" due to high temperatures, using a distributed lag non-linear function, and an "added effect" due to consecutive high temperature days. I pooled yearly effects across regions and overall effects at both regional and national levels. The effects of high temperature (both main and added effects) on elderly mortality varied greatly by year, city and region. The years with higher heat-related mortality were often followed by those with relatively lower mortality. Understanding this variability in the effects of high temperatures is important for the development of heat-warning systems.
In conclusion, this thesis makes contributions in several respects. The case-crossover design was combined with a distributed lag non-linear model to assess the effects of temperature on mortality in Tianjin; this allows the case-crossover design to flexibly estimate the non-linear and delayed effects of temperature. Both extreme cold and high temperatures increased the risk of mortality in Tianjin. A time series model using a single site's temperature, or temperature averaged over several sites, can be used to examine the effects of temperature on mortality. Temperature change, whether a substantial drop or a substantial rise, increases the risk of mortality. The effect of high temperature on mortality is highly variable from year to year.
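The temperature-change exposure used in the thesis (the current day's mean temperature minus the previous day's mean, with changes of more than 3 °C in either direction treated as large changes) can be sketched as follows; the daily temperature series is made up for illustration.

```python
# Day-to-day temperature change and flagging of large changes (> 3 °C in
# absolute value), as in the thesis's temperature-change analysis.
def temperature_changes(daily_means):
    """Change for each day: today's mean minus yesterday's mean."""
    return [t1 - t0 for t0, t1 in zip(daily_means, daily_means[1:])]

def flag_large_changes(daily_means, threshold=3.0):
    """Indices (of the *current* day) where |change| exceeds threshold."""
    changes = temperature_changes(daily_means)
    return [i + 1 for i, c in enumerate(changes) if abs(c) > threshold]

temps = [24.0, 25.5, 21.0, 21.5, 26.0, 25.0]  # illustrative °C series
print(temperature_changes(temps))  # [1.5, -4.5, 0.5, 4.5, -1.0]
print(flag_large_changes(temps))   # [2, 4]
```

In the actual analyses this exposure enters a time series Poisson regression alongside mean temperature, so the reported risks are adjusted for the temperature level itself.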