921 results for Sophisticated Instruments
Abstract:
The perennial issues of student engagement, success and retention in higher education continue to attract attention as the salience of teaching and learning funding and performance measures has increased. This paper addresses the question of the responsibility or place of higher education institutions (HEIs) for initiating, planning, managing and evaluating their student engagement, success and retention programs and strategies. An evaluation of the current situation indicates the need for a sophisticated approach to assessing the ability of HEIs to proactively design programs and practices that enhance student engagement. An approach—the Student Engagement Success and Retention Maturity Model (SESR-MM)—is proposed and its development, current status, and relationship with and possible use in benchmarking are discussed.
Abstract:
X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained using local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes, so an extremely large number of calculations is required. To resolve this memory problem, parallelization with OpenMP was used to make optimal use of the shared-memory infrastructure of cache-coherent Non-Uniform Memory Access (ccNUMA) machines such as the iVEC SGI Altix 3700Bx2 Supercomputer. We regard adequate visualization of the results as an important element of this first, pioneering study.
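By way of illustration of the percolation step only, the sketch below labels pore clusters in a binary 3-D image and tests whether any cluster spans the sample in one direction. It uses scipy.ndimage.label, which produces the same cluster labelling that the Hoshen-Kopelman algorithm computes; the array name, grid size and porosity value are illustrative assumptions, not details of our workflow.

# Minimal percolation analysis of a binary 3-D pore image (illustrative sketch).
# Pores are assumed True/1 and solid False/0; names and sizes are hypothetical.
import numpy as np
from scipy import ndimage

def percolation_analysis(pore: np.ndarray):
    """Label pore clusters (6-connectivity, as in Hoshen-Kopelman labelling)
    and test whether any cluster spans the sample in the z direction."""
    porosity = pore.mean()                       # fraction of pore voxels
    labels, n_clusters = ndimage.label(pore)     # connected-component labelling
    top = np.unique(labels[0, :, :])             # cluster ids touching z = 0
    bottom = np.unique(labels[-1, :, :])         # cluster ids touching the far face
    spanning = set(top) & set(bottom) - {0}      # ids present on both faces (0 = solid)
    return porosity, n_clusters, bool(spanning)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.random((64, 64, 64)) < 0.35     # synthetic image at ~35% porosity
    print(percolation_analysis(sample))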
Abstract:
Hosted on YouTube and shown in various locations, this video presents a 3D mock-up of a personal house-purchasing process. A path-traversal metaphor is used to give a sense of progression through the process stages. The intention is to enable console devices such as an Xbox to be used to consume business processes, so that businesses can expose their internal processes to consumers through sophisticated user interfaces. The demonstrator was developed using Microsoft XNA, with assistance from Suncorp Bank and the Smart Services CRC. More information at: www.bpmve.org
Abstract:
Controlled drug delivery is a key topic in modern pharmacotherapy, where controlled drug delivery devices are required to prolong the period of release, maintain a constant release rate, or release the drug with a predetermined release profile. In the pharmaceutical industry, the development of a controlled drug delivery device may be facilitated enormously by mathematical modelling of drug release mechanisms, directly decreasing the number of necessary experiments. Such modelling is difficult because several mechanisms are involved in the drug release process. The main drug release mechanisms of a controlled release device depend on the device's physicochemical properties and include diffusion, swelling and erosion.

In this thesis, four controlled drug delivery models are investigated. These models variously incorporate solvent penetration into the polymeric device, swelling of the polymer, polymer erosion and drug diffusion out of the device, but all share two key features. The first is that solvent penetration causes the polymer to undergo a transition from a glassy state to a rubbery state; the interface between the two states is modelled as a moving boundary whose speed is governed by a kinetic law. The second is that drug diffusion occurs only in the rubbery region of the polymer, with a nonlinear diffusion coefficient that depends on the solvent concentration. These models are analysed using both formal asymptotics and numerical computation, with front-fixing methods and the method of lines with finite difference approximations used to solve them numerically. The numerical scheme is conservative, accurate and easily applied to moving boundary problems, and is explained in detail in Section 3.2. The small time asymptotic analysis in Sections 5.3.1, 6.3.1 and 7.2.1 shows that these models exhibit the non-Fickian behaviour referred to as Case II diffusion, together with an initially constant rate of drug release, which is appealing to the pharmaceutical industry because it indicates zero-order release. The numerical results of the models qualitatively confirm the experimental behaviour reported in the literature. The knowledge gained from these models can help in the development of more complex multi-layered drug delivery devices that achieve sophisticated drug release profiles. A multi-layer matrix tablet, consisting of a number of polymer layers designed to provide sustained and constant drug release or bimodal drug release, is also discussed in this research.

The moving boundary problem describing solvent penetration into the polymer also arises in melting and freezing problems, which have been modelled as the classical one-phase Stefan problem. The classical one-phase Stefan problem has unrealistic singularities at the complete melting time. We therefore investigate the effect of including kinetic undercooling in the melting problem; the resulting formulation is called the one-phase Stefan problem with kinetic undercooling. Interestingly, we find that the unrealistic singularities of the classical one-phase Stefan problem at the complete melting time are regularised, and the small time asymptotic analysis in Section 3.3 shows that the small time behaviour of the one-phase Stefan problem with kinetic undercooling differs from that of the classical problem.
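For orientation, a generic dimensionless statement of the one-phase Stefan problem with kinetic undercooling on a slab is given below; the nondimensionalisation, sign conventions and the symbols u, s, epsilon and beta are illustrative choices rather than the thesis's exact formulation.

\begin{align*}
&\frac{\partial u}{\partial t} = \frac{\partial^{2} u}{\partial x^{2}},
  && 0 < x < s(t), \\
&u(0,t) = 1,
  && \text{(heated fixed boundary)} \\
&u\bigl(s(t),t\bigr) = \epsilon\,\frac{\mathrm{d}s}{\mathrm{d}t},
  && \text{(kinetic undercooling; $\epsilon = 0$ recovers the classical problem)} \\
&\frac{\mathrm{d}s}{\mathrm{d}t} = -\beta\,\frac{\partial u}{\partial x}\bigg|_{x=s(t)},
  && \text{(Stefan condition; $\beta$ an illustrative Stefan-number parameter)}
\end{align*}

With epsilon > 0 the interface temperature is tied to the front speed; this coupling is the mechanism by which the singular behaviour of the classical problem at complete melting is regularised.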
In the case of melting very small particles, it is known that surface tension effects are important. The effect of including surface tension in the melting problem for nanoparticles (without kinetic undercooling) has been investigated previously; however, the one-phase Stefan problem with surface tension exhibits finite-time blow-up. We therefore investigate the effect of including both surface tension and kinetic undercooling in the melting problem for nanoparticles and find that the solution continues to exist until complete melting. Including kinetic undercooling and surface tension in the melting problems provides further insight into the regularisation of unphysical singularities in the classical one-phase Stefan problem, gives a better understanding of the melting of a particle, and contributes to the body of knowledge on melting and freezing due to heat conduction.
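The following is a minimal sketch of the front-fixing / method-of-lines approach applied to the kinetic-undercooling formulation sketched above, in planar geometry and without surface tension. The parameter values, grid size, initial data and the use of scipy's stiff integrator are illustrative assumptions; this is not the conservative scheme described in Section 3.2.

# Front-fixing + method of lines for a dimensionless one-phase Stefan problem
# with kinetic undercooling (illustrative sketch, not the thesis's scheme).
# Liquid occupies 0 < x < s(t); beta, eps, grid size and initial data are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

N = 50                        # number of intervals in the fixed coordinate xi = x/s(t)
h = 1.0 / N
xi = np.linspace(0.0, 1.0, N + 1)
beta = 1.0                    # assumed Stefan-number parameter
eps = 0.1                     # kinetic-undercooling parameter; eps = 0 is the classical problem

def rhs(t, y):
    u = np.empty(N + 1)
    u[1:N] = y[:-1]           # interior temperatures
    s = y[-1]                 # front position
    u[0] = 1.0                # fixed-boundary temperature

    # Kinetic undercooling couples u at xi = 1 to ds/dt, which in turn depends on
    # u_xi(1, t); eliminating ds/dt gives u[N] in closed form.
    c = eps * beta / (2.0 * h * s)
    u[N] = c * (4.0 * u[N - 1] - u[N - 2]) / (1.0 + 3.0 * c)

    # Stefan condition: ds/dt = -(beta / s) * u_xi(1, t), one-sided difference at xi = 1.
    u_xi_1 = (3.0 * u[N] - 4.0 * u[N - 1] + u[N - 2]) / (2.0 * h)
    sdot = -(beta / s) * u_xi_1

    # Heat equation in front-fixed coordinates: u_t = u_xixi / s^2 + (xi * sdot / s) * u_xi
    dudt = (u[2:] - 2.0 * u[1:N] + u[:-2]) / (h * h * s * s) \
           + xi[1:N] * sdot / s * (u[2:] - u[:-2]) / (2.0 * h)
    return np.append(dudt, sdot)

# Approximately consistent initial data: a thin liquid layer with a linear profile.
s0 = 0.1
u0 = 1.0 - xi[1:N]
sol = solve_ivp(rhs, (0.0, 1.0), np.append(u0, s0), method="BDF", rtol=1e-6)
print("front position s(t=1) ~", sol.y[-1, -1])

Mapping x to xi = x/s(t) pins the moving front at xi = 1, so the method of lines can be applied on a static grid; the price is the extra advective term proportional to xi ds/dt.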
Abstract:
Vessel-source marine pollution is one of the main sources of marine pollution in Bangladesh. Due to the unfettered operation of vessels, the country has been exposed to massive pollution that is causing a serious imbalance in the marine environment. Against this backdrop, this article seeks to demonstrate that the regulatory system of Bangladesh should be strengthened and made more effective in the light of international instruments, to ensure the conservation and sustainable management of its marine environment. With this aim, the article examines the present status of implementation of the MARPOL Convention in Bangladesh.
Abstract:
Background: There is growing consensus that a multidisciplinary, comprehensive and standardised process for assessing the fitness of older patients for chemotherapy should be undertaken to determine appropriate cancer treatment. Aim: This study tested a model of cancer care for the older patient incorporating Comprehensive Geriatric Assessment (CGA), which aimed to ensure that 'fit' individuals amenable to active treatment were accurately identified; that 'vulnerable' patients more suitable for modified or supportive regimens were determined; and that 'frail' individuals who would benefit most from palliative regimens were also identified and offered the appropriate level of care. Methods: A consecutive-series sample of n=178 patients aged >65 years was recruited from a major Australian cancer centre. The following instruments were administered by an oncogeriatric nurse prior to treatment: Vulnerable Elders Survey-13; Cumulative Illness Rating Scale (Geriatric); Malnutrition Screening Tool; Mini-Mental State Examination; Geriatric Depression Scale; Barthel Index; and Lawton Instrumental Activities of Daily Living Scale. Scores from these instruments were aggregated to predict patient fitness, vulnerability or frailty for chemotherapy. Physicians provided a concurrent (blinded) prediction of patient fitness, vulnerability or frailty based on their clinical assessment. Data were also collected on actual patient outcomes (e.g. treatment completed as predicted, treatment reduced) during monthly audits of patient trajectories. Data analysis: Data analysis is underway. A sample of 178 is adequate to detect, with 90% power, kappa coefficients of agreement between CGA and physician assessments of κ > 0.90 ("almost perfect agreement"). Primary endpoints comprise a) whether the nurse-led CGA determination of fit, vulnerable or frail agrees with the oncologist's assessment of fit, vulnerable or frail, and b) whether the CGA and physician assessments accurately predict actual patient outcomes. Conclusion: An oncogeriatric nurse-led model of care is currently being developed from the results. We conclude with a discussion of the pivotal role of nurses in CGA-based models of care.
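By way of illustration of the planned agreement analysis only, the short sketch below computes Cohen's kappa between the nurse-led CGA category and the physician's category for a handful of hypothetical patients; the ratings shown are invented for the example and are not study data.

# Illustrative sketch of the agreement analysis: Cohen's kappa between two raters,
# each assigning fit / vulnerable / frail. Ratings below are hypothetical.
from sklearn.metrics import cohen_kappa_score

cga_rating = ["fit", "vulnerable", "frail", "fit", "vulnerable", "fit"]
physician_rating = ["fit", "vulnerable", "vulnerable", "fit", "vulnerable", "fit"]

kappa = cohen_kappa_score(cga_rating, physician_rating)
print(f"Cohen's kappa = {kappa:.2f}")   # values above 0.90 read as 'almost perfect agreement'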