633 results for Real blow up
at Queensland University of Technology - ePrints Archive
Abstract:
We treat two related moving boundary problems. The first is the ill-posed Stefan problem for melting a superheated solid in one Cartesian coordinate. Mathematically, this is the same problem as that for freezing a supercooled liquid, with applications to crystal growth. By applying a front-fixing technique with finite differences, we reproduce existing numerical results in the literature, concentrating on solutions that break down in finite time. This sort of finite-time blow-up is characterised by the speed of the moving boundary becoming unbounded in the blow-up limit. The second problem, which is an extension of the first, is proposed to simulate aspects of a particular two-phase Stefan problem with surface tension. We study this novel moving boundary problem numerically, and provide results that support the hypothesis that it exhibits a similar type of finite-time blow-up as the more complicated two-phase problem. The results are unusual in the sense that it appears the addition of surface tension transforms a well-posed problem into an ill-posed one.
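The front-fixing technique mentioned above can be illustrated with a minimal sketch for the well-posed classical one-phase melting problem (the function name, grid sizes, initial profile and time step below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def stefan_front_fixing(n=50, dt=1e-6, t_end=0.01, s0=0.1):
    """Explicit front-fixing (Landau) scheme for the one-phase Stefan
    problem u_t = u_xx on 0 < x < s(t), with u(0,t) = 1, u(s(t),t) = 0
    and Stefan condition ds/dt = -u_x(s(t), t).

    The map xi = x / s(t) fixes the moving domain onto [0, 1], turning
    the heat equation into u_t = u_xixi / s^2 + xi * (s'/s) * u_xi."""
    xi = np.linspace(0.0, 1.0, n + 1)
    h = xi[1] - xi[0]
    u = 1.0 - xi               # initial profile satisfying both BCs
    s, t = s0, 0.0
    while t < t_end:
        # Stefan condition via a one-sided difference at xi = 1
        sdot = -(u[-1] - u[-2]) / (h * s)
        un = u.copy()
        uxx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
        uxi = (u[2:] - u[:-2]) / (2.0 * h)
        # transformed PDE on the fixed grid (interior points only)
        un[1:-1] = u[1:-1] + dt * (uxx / s**2 + xi[1:-1] * sdot / s * uxi)
        u = un
        s += dt * sdot
        t += dt
    return s, u

s_final, u_final = stefan_front_fixing()
print(f"front position at t = 0.01: {s_final:.4f}")
```

For the ill-posed superheated/supercooled version studied in the paper the same transformation applies, but the front speed grows without bound as the blow-up time is approached.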
Abstract:
Radial Hele-Shaw flows are treated analytically using conformal mapping techniques. The geometry of interest has a doubly-connected annular region of viscous fluid surrounding an inviscid bubble that is either expanding or contracting due to a pressure difference caused by injection or suction of the inviscid fluid. The zero-surface-tension problem is ill-posed for both bubble expansion and contraction, as both scenarios involve viscous fluid displacing inviscid fluid. Exact solutions are derived by tracking the location of singularities and critical points in the analytic continuation of the mapping function. We show that by treating the critical points, it is easy to observe finite-time blow-up, and the evolution equations may be written in exact form using complex residues. We present solutions that start with cusps on one interface and end with cusps on the other, as well as solutions that have the bubble contracting to a point. For the latter solutions, the bubble approaches an ellipse in shape at extinction.
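For orientation, the zero-surface-tension evolution in the simply connected analogue (a single interface around an injection point, rather than the annular geometry of the paper, which requires a map from an annulus) is governed by the Polubarinova-Galin equation for the conformal map $z = f(\zeta, t)$ from the unit disc; a sketch of the standard nondimensional form, with $Q$ denoting the injection rate:

$$
\operatorname{Re}\left[\, \overline{\zeta\, f_\zeta(\zeta,t)}\; f_t(\zeta,t) \,\right] \;=\; \frac{Q}{2\pi}, \qquad |\zeta| = 1 .
$$

Singularities of the analytic continuation of $f$ outside the unit disc move under this equation, and blow-up occurs when one reaches the unit circle, producing a cusp on the physical interface.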
Abstract:
Controlled drug delivery is a key topic in modern pharmacotherapy, where controlled drug delivery devices are required to prolong the period of release, maintain a constant release rate, or release the drug with a predetermined release profile. In the pharmaceutical industry, the development process of a controlled drug delivery device may be facilitated enormously by mathematical modelling of drug release mechanisms, directly decreasing the number of necessary experiments. Such mathematical modelling is difficult because several mechanisms are involved during the drug release process. The main drug release mechanisms of a controlled release device are based on the device’s physicochemical properties, and include diffusion, swelling and erosion. In this thesis, four controlled drug delivery models are investigated. These four models selectively involve the solvent penetration into the polymeric device, the swelling of the polymer, the polymer erosion and the drug diffusion out of the device, but all share two common key features. The first is that the solvent penetration into the polymer causes the transition of the polymer from a glassy state into a rubbery state. The interface between the two states of the polymer is modelled as a moving boundary and the speed of this interface is governed by a kinetic law. The second feature is that drug diffusion only happens in the rubbery region of the polymer, with a nonlinear diffusion coefficient that depends on the concentration of solvent. These models are analysed using both formal asymptotics and numerical computation, where front-fixing methods and the method of lines with finite difference approximations are used to solve the models numerically. This numerical scheme is conservative, accurate and easily applied to moving boundary problems, and is thoroughly explained in Section 3.2.
From the small-time asymptotic analysis in Sections 5.3.1, 6.3.1 and 7.2.1, these models exhibit the non-Fickian behaviour referred to as Case II diffusion, and an initial constant rate of drug release, which is appealing to the pharmaceutical industry because it indicates zero-order release. The numerical results of the models qualitatively confirm the experimental behaviour identified in the literature. The knowledge obtained from investigating these models can help to develop more complex multi-layered drug delivery devices in order to achieve sophisticated drug release profiles. A multi-layer matrix tablet, which consists of a number of polymer layers designed to provide sustained and constant drug release or bimodal drug release, is also discussed in this research. The moving boundary problem describing the solvent penetration into the polymer also arises in melting and freezing problems, which have been modelled as the classical one-phase Stefan problem. The classical one-phase Stefan problem contains unrealistic singularities at the complete melting time. Hence we investigate the effect of including kinetic undercooling in the melting problem; this is called the one-phase Stefan problem with kinetic undercooling. Interestingly, we discover that the unrealistic singularities of the classical one-phase Stefan problem at the complete melting time are regularised, and the small-time asymptotic analysis in Section 3.3 shows that the small-time behaviour of the one-phase Stefan problem with kinetic undercooling differs from that of the classical problem. In the case of melting very small particles, it is known that surface tension effects are important. The effect of including surface tension in the melting problem for nanoparticles (without kinetic undercooling) has been investigated in the past; however, the one-phase Stefan problem with surface tension exhibits finite-time blow-up.
Therefore we investigate the effect of including both surface tension and kinetic undercooling in the melting problem for nanoparticles, and find that the solution continues to exist until complete melting. The investigation of including kinetic undercooling and surface tension in the melting problems reveals more insight into the regularisation of unphysical singularities in the classical one-phase Stefan problem. This investigation gives a better understanding of melting a particle, and contributes to the current body of knowledge related to melting and freezing due to heat conduction.
Abstract:
The addition of surface tension to the classical Stefan problem for melting a sphere causes the solution to blow up at a finite time before complete melting takes place. This singular behaviour is characterised by the speed of the solid-melt interface and the flux of heat at the interface both becoming unbounded in the blow-up limit. In this paper, we use numerical simulation for a particular energy-conserving one-phase version of the problem to show that kinetic undercooling regularises this blow-up, so that the model with both surface tension and kinetic undercooling has solutions that are regular right up to complete melting. By examining the regime in which the dimensionless kinetic undercooling parameter is small, our results demonstrate how physically realistic solutions to this Stefan problem are consistent with observations of abrupt melting of nanoscaled particles.
Abstract:
The melting temperature of a nanoscaled particle is known to decrease as the curvature of the solid-melt interface increases. This relationship is most often modelled by a Gibbs-Thomson law, with the decrease in melting temperature proposed to be a product of the curvature of the solid-melt interface and the surface tension. Such a law must break down for sufficiently small particles, since the curvature becomes singular in the limit that the particle radius vanishes. Furthermore, the use of this law as a boundary condition for a Stefan-type continuum model is problematic because it leads to a physically unrealistic form of mathematical blow-up at a finite particle radius. By numerical simulation, we show that the inclusion of nonequilibrium interface kinetics in the Gibbs-Thomson law regularises the continuum model, so that the mathematical blow-up is suppressed. As a result, the solution continues until complete melting, and the corresponding melting temperature remains finite for all time. The results of the adjusted model are consistent with experimental findings of abrupt melting of nanoscaled particles. This small-particle regime appears to be closely related to the problem of melting a superheated particle.
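The combined interface condition described above is commonly written, in nondimensional variables, as a Gibbs-Thomson law with a kinetic term; a sketch using illustrative symbols ($\sigma$, $\epsilon$, $\kappa$, $v_n$ are generic names, not necessarily the paper's notation):

$$
u \;=\; -\sigma \kappa \;-\; \epsilon\, v_n \qquad \text{on the solid–melt interface},
$$

where $u$ is the dimensionless interface temperature, $\kappa$ the interface curvature, $v_n$ the normal velocity of the interface, $\sigma$ the surface-tension parameter and $\epsilon$ the kinetic-undercooling parameter. Setting $\epsilon = 0$ recovers the pure Gibbs-Thomson condition, whose solutions exhibit the finite-radius blow-up discussed above.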
Abstract:
Under certain conditions, the mathematical models governing the melting of nano-sized particles predict unphysical results, which suggests these models are incomplete. This thesis studies the addition of different physical effects to these models, using analytic and numerical techniques to obtain realistic and meaningful results. In particular, the mathematical "blow-up" of solutions to ill-posed Stefan problems is examined, and the regularisation of this blow-up via kinetic undercooling. Other effects such as surface tension, density change and size-dependent latent heat of fusion are also analysed.
Abstract:
The mathematical model of a steadily propagating Saffman-Taylor finger in a Hele-Shaw channel has applications to two-dimensional interacting streamer discharges which are aligned in a periodic array. In the streamer context, the relevant regularisation on the interface is not provided by surface tension, but instead has been postulated to involve a mechanism equivalent to kinetic undercooling, which acts to penalise high velocities and prevent blow-up of the unregularised solution. Previous asymptotic results for the Hele-Shaw finger problem with kinetic undercooling suggest that for a given value of the kinetic undercooling parameter, there is a discrete set of possible finger shapes, each analytic at the nose and occupying a different fraction of the channel width. In the limit in which the kinetic undercooling parameter vanishes, the fraction for each family approaches 1/2, suggesting that this selection of 1/2 by kinetic undercooling is qualitatively similar to the well-known analogue with surface tension. We treat the numerical problem of computing these Saffman-Taylor fingers with kinetic undercooling, which turns out to be more subtle than the analogue with surface tension, since kinetic undercooling permits finger shapes which are corner-free but not analytic. We provide numerical evidence for the selection mechanism by setting up a problem with both kinetic undercooling and surface tension, and numerically taking the limit that the surface tension vanishes.
Abstract:
The PISA assessment instruments for students’ scientific literacy in 2000, 2003 and 2006 have each consisted of units made up of a real-world context involving Science and Technology, about which students are asked a number of cognitive and affective questions. This paper discusses a number of issues arising from this use of S&T contexts in PISA and the implications they have for the current renewed interest in context-based science education. Suitably chosen contexts can engage both boys and girls. Secondary analyses of the students’ responses, using the contextual sets of items as the unit of analysis, provide new information about the levels of performance in PISA 2006 Science. Embedding affective items in the achievement test did not lead to significant gender/context interactions, and context interactions were smaller than competency ones. A number of implications for context-based science teaching and learning are outlined, and the PISA 2006 Science test is suggested as a model for its assessment.
Abstract:
This paper presents the preliminary results in establishing a strategy for predicting Zenith Tropospheric Delay (ZTD) and relative ZTD (rZTD) between Continuous Operating Reference Stations (CORS) in near real-time. It is anticipated that the predicted ZTD or rZTD can assist network-based Real-Time Kinematic (RTK) performance over long inter-station distances, ultimately enabling a cost-effective method of delivering precise positioning services to sparsely populated regional areas, such as Queensland. This research firstly investigates two ZTD solutions: 1) the post-processed IGS ZTD solution and 2) the near real-time ZTD solution. The near real-time solution is obtained through the GNSS processing software package (Bernese) that has been deployed for this project. The predictability of the near real-time Bernese solution is analyzed and compared to the post-processed IGS solution, which acts as the benchmark. The predictability analyses were conducted with prediction times of 15, 30, 45, and 60 minutes to determine the error with respect to timeliness. The predictability of ZTD and relative ZTD is characterized by using the previously estimated ZTD as the predicted ZTD of the current epoch. This research has shown that both the ZTD and rZTD prediction errors are random in nature; the STD grows from a few millimeters to sub-centimeter level as the prediction interval ranges from 15 to 60 minutes. Additionally, the rZTD predictability shows very little dependency on the length of the tested baselines of up to 1000 kilometers. Finally, the comparison of the near real-time Bernese solution with the IGS solution has shown a slight degradation in prediction accuracy. The less accurate NRT solution has an STD error of 1 cm within a delay of 50 minutes. However, some larger errors of up to 10 cm are observed.
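The prediction strategy described above (using the previously estimated ZTD as the prediction for the current epoch) is a persistence predictor. A minimal sketch on a synthetic random-walk ZTD series, where the data, magnitudes and function name are illustrative assumptions, not values from the paper:

```python
import numpy as np

def persistence_error_std(series, lag):
    """Predict each epoch by the value observed `lag` epochs earlier
    and return the standard deviation of the prediction error."""
    err = series[lag:] - series[:-lag]
    return float(err.std())

# synthetic ZTD in metres: a random walk, one sample per minute
rng = np.random.default_rng(42)
ztd = 2.4 + np.cumsum(rng.normal(0.0, 2e-4, 10_000))

# the error STD grows with the prediction interval, consistent with
# the behaviour reported for intervals of 15 to 60 minutes
for lag in (15, 30, 45, 60):
    print(f"lag {lag:2d} min: STD = {persistence_error_std(ztd, lag):.2e} m")
```

For a random-walk process the error STD scales like the square root of the prediction interval, which matches the qualitative growth from millimeter to sub-centimeter level described in the abstract.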
Abstract:
Computer systems have become commonplace in most SMEs and technology is increasingly becoming a part of doing business. In recent years, the Internet has become readily available to businesses; consequently there has been growing pressure on SMEs to take up e-commerce. However, e-commerce is perceived by many as being unproven in terms of business benefit. This research aims to determine what, if any, benefits are derived from assimilating e-commerce technologies into SME business processes. This paper presents three in-depth case studies from the Real Estate industry in a regional setting. Overall, findings were positive and identified the following experiences: enhanced business efficiencies, cost benefits, improved customer interactions and increased business return on investment.
Abstract:
In this paper we describe the recent development of a low-bandwidth wireless camera sensor network. We propose a simple, yet effective, network architecture which allows multiple cameras to be connected to the network and synchronize their communication schedules. Image compression of greater than 90% is performed at each node running on a local DSP coprocessor, resulting in nodes using 1/8th the energy compared to streaming uncompressed images. We briefly introduce the Fleck wireless node and the DSP/camera sensor, and then outline the network architecture and compression algorithm. The system is able to stream color QVGA images over the network to a base station at up to 2 frames per second. © 2007 IEEE.
Abstract:
A road traffic noise prediction model (ASJ MODEL-1998) has been integrated with a road traffic simulator (AVENUE) to produce the Dynamic areawide Road traffic NoisE simulator-DRONE. This traffic-noise-GIS based integrated tool is upgraded to predict noise levels in built-up areas. The integration of traffic simulation with a noise model provides dynamic access to traffic flow characteristics and hence automated and detailed predictions of traffic noise. The prediction is made not only on the spatial scale but also on the temporal scale. The linkage with GIS gives a visual representation of noise pollution in the form of dynamic areawide traffic noise contour maps. The application of DRONE to a real-world built-up area is also presented.
Abstract:
This paper discusses a new paradigm of real-time simulation of power systems in which equipment can be interfaced with a real-time digital simulator. In this scheme, one part of a power system is simulated using a real-time simulator, while the other part is implemented as a physical system. The only interface of the physical system with the computer-based simulator is through a data-acquisition system. The physical system is driven by a voltage-source converter (VSC) that mimics the power system simulated in the real-time simulator. In this paper, the VSC operates in a voltage-control mode to track the point-of-common-coupling voltage signal supplied by the digital simulator. Splitting a network into two parts in this way and running a real-time simulation in parallel with a physical system is called a power network in loop here. This opens up the possibility of studying the interconnection of one or several distributed generators to a complex power network. The proposed implementation is verified through simulation studies using PSCAD/EMTDC and through hardware implementation on a TMS320G2812 DSP.
Abstract:
This paper anatomises emerging developments in online community engagement in a major global industry: real estate. Economists argue that we are entering a ‘social network economy’ in which ‘complex social networks’ govern consumer choice and product value. In the light of this, organisations are shifting from thinking and behaving in the conventional ‘value chain’ model--in which exchanges between firms and customers are one-way only, from the firm to the consumer--to the ‘value ecology’ model, in which consumers and their networks become co-creators of the value of the product. This paper studies the way in which the global real estate industry is responding to this environment. This paper identifies three key areas in which online real estate ‘value ecology’ work is occurring: real estate social networks, games, and locative media / augmented reality applications. Uptake of real estate applications is, of course, user-driven: the paper not only highlights emerging innovations; it also identifies which of these innovations are actually being taken up by users, and the content contributed as a result. The paper thus provides a case study of one major industry’s shift into a web 2.0 communication model, focusing on emerging trends and issues.
Abstract:
Last week I called the Australian federal campaign the Inception election. As we lurch toward voting day on August 21, reality has tried to kick in, but to little avail. The two leaders, Prime Minister Julia Gillard (Labor) and challenger Tony Abbott (Liberal), both of whom recently toppled their predecessors in party-room coups, are now frantically searching for their own identity. And that’s what the election itself is increasingly about. Even though both have substantial track records as ministers, they are untried as national leaders. The real conundrum of the campaign – for them, if not for voters – is: Who the heck are these people?