692 results for photon transport theory

in Queensland University of Technology - ePrints Archive


Relevance:

90.00%

Publisher:

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained.
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components. Bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence would indicate the potential to overcome a major problem of the two component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone and has poorer precision (approximately twice the coefficient of variation) than the standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements,
2. demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique,
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral, and
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
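The two-energy tissue decomposition underlying the DEXA models above can be sketched in a few lines: attenuation measured at two X-ray energies yields two linear equations in the areal densities of bone mineral and soft tissue. The attenuation coefficients, beam intensities and phantom composition below are illustrative placeholders, not values from the thesis.

```python
# Minimal sketch of two-component dual-energy (DEXA) decomposition.
# Assumed mass attenuation coefficients (cm^2/g) at a low and a high energy;
# these are placeholder numbers chosen only to make the system well-posed.
import math

mu_bone = {"low": 0.60, "high": 0.30}   # bone mineral
mu_soft = {"low": 0.25, "high": 0.20}   # lean soft tissue

def decompose(i0, i_low, i_high):
    """Solve the 2x2 system  ln(I0/I_E) = mu_b(E)*x_b + mu_s(E)*x_s
    for the areal densities x_b, x_s (g/cm^2)."""
    a_low = math.log(i0 / i_low)
    a_high = math.log(i0 / i_high)
    det = mu_bone["low"] * mu_soft["high"] - mu_bone["high"] * mu_soft["low"]
    x_bone = (a_low * mu_soft["high"] - a_high * mu_soft["low"]) / det
    x_soft = (mu_bone["low"] * a_high - mu_bone["high"] * a_low) / det
    return x_bone, x_soft

# Forward-simulate a phantom with known composition, then recover it.
xb_true, xs_true = 1.2, 8.0
i_low = 1e6 * math.exp(-(mu_bone["low"] * xb_true + mu_soft["low"] * xs_true))
i_high = 1e6 * math.exp(-(mu_bone["high"] * xb_true + mu_soft["high"] * xs_true))
xb, xs = decompose(1e6, i_low, i_high)
print(xb, xs)
```

In this noise-free sketch the decomposition recovers the phantom exactly; the scatter contributions studied in the thesis perturb the measured intensities and hence the recovered densities.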

Relevance:

80.00%

Publisher:

Abstract:

Unlike standard applications of transport theory, the transport of molecules and cells during embryonic development often takes place within growing multidimensional tissues. In this work, we consider a model of diffusion on uniformly growing lines, disks, and spheres. An exact solution of the partial differential equation governing the diffusion of a population of individuals on the growing domain is derived. Using this solution, we study the survival probability, S(t). For the standard nongrowing case with an absorbing boundary, we observe that S(t) decays to zero in the long time limit. In contrast, when the domain grows linearly or exponentially with time, we show that S(t) decays to a constant, positive value, indicating that a proportion of the diffusing substance remains on the growing domain indefinitely. Comparing S(t) for diffusion on lines, disks, and spheres indicates that there are minimal differences in S(t) in the limit of zero growth and minimal differences in S(t) in the limit of fast growth. In contrast, for intermediate growth rates, we observe modest differences in S(t) between different geometries. These differences can be quantified by evaluating the exact expressions derived and presented here.
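The qualitative behaviour of S(t) reported above can be reproduced with a crude random-walk simulation, not the paper's exact solution: walkers diffuse on an interval with absorbing ends that either stays fixed or grows exponentially, and uniform growth advects each walker in proportion to its position. All parameter values are illustrative choices.

```python
# Random-walk sketch: survival on a static vs exponentially growing interval.
import math
import random

def survival(growth_rate, n_walkers=500, t_max=400.0, dt=0.2,
             diff=1.0, l0=10.0, seed=1):
    rng = random.Random(seed)
    step = math.sqrt(2.0 * diff * dt)  # Brownian step scale per dt
    alive = 0
    for _ in range(n_walkers):
        x, t = l0 / 2.0, 0.0
        while t < t_max:
            length = l0 * math.exp(growth_rate * t)
            # Diffusive step plus advection due to uniform domain growth.
            x += rng.gauss(0.0, step) + growth_rate * x * dt
            t += dt
            if x <= 0.0 or x >= length:   # absorbed at either boundary
                break
        else:
            alive += 1
    return alive / n_walkers

s_static = survival(0.0)    # nongrowing: S(t) decays toward zero
s_growing = survival(0.05)  # exponential growth: S(t) plateaus above zero
print(s_static, s_growing)
```

With these parameters essentially no walkers survive on the static interval, while a substantial fraction survives indefinitely on the growing one, matching the paper's conclusion for exponentially growing domains.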

Relevance:

30.00%

Publisher:

Abstract:

Focuses on a study which introduced an iterative modeling method that combines properties of ordinary least squares (OLS) with hierarchical tree-based regression (HTBR) in transportation engineering. Information on OLS and HTBR; Comparison and contrasts of OLS and HTBR; Conclusions.
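A minimal sketch of the hybrid idea, assuming the simplest possible combination: an ordinary least squares fit followed by a one-split regression "tree" on the residuals. The toy data and the single-split tree are our simplifications; the study's HTBR procedure is more elaborate.

```python
# OLS line fit, then a one-level regression tree on the residuals.
def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

def best_split(xs, rs):
    """Threshold on x minimising the residual sum of squared errors."""
    best = None
    for t in sorted(set(xs))[1:]:
        left = [r for x, r in zip(xs, rs) if x < t]
        right = [r for x, r in zip(xs, rs) if x >= t]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - ml) ** 2 for r in left)
               + sum((r - mr) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    return best[1:]

# Toy data with a level shift after x = 4 that a single line cannot capture.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [2.1, 3.9, 6.2, 8.0, 13.1, 15.0, 17.2, 18.8]
b0, b1 = ols(xs, ys)
resid = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
t, mean_left, mean_right = best_split(xs, resid)
print(b1, t)
```

The tree split lands at the kink the line misses, illustrating why the two methods complement each other.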

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The aims of this paper are to demonstrate the application of Sen’s theory of well-being, the capability approach; to conceptualise the state of transportation disadvantage; and to underpin a theoretically sound indicator selection process.
Design/methodology/approach – This paper reviews and examines various measurement approaches to transportation disadvantage in order to select indicators and develop an innovative framework of urban transportation disadvantage.
Originality/value – The paper provides further understanding of the state of transportation disadvantage from the capability approach perspective. In addition, building from this understanding, a validated and systematic framework is developed to select relevant indicators.
Practical implications – The multi-indicator approach has a high tendency to double-count transportation disadvantage, inflates the number of the TDA population, and accounts for each indicator only in terms of its individual effects. Instead, indicators that are identified based on a transportation disadvantage scenario will yield more accurate results.
Keywords – transport disadvantage, the capability approach, accessibility, measuring urban transportation disadvantage, indicator selection
Paper type – Academic Research Paper

Relevance:

30.00%

Publisher:

Abstract:

Adolescent injury is a significant health concern and can be a result of adolescents' engagement in transport-related behaviours. Significant planning and formative research are, however, needed to inform prevention programme design. This presentation reports on the development and evaluation of a curriculum programme that was shown to be effective in reducing transport-related risks and injuries. Early adolescents report injuries resulting from a number of transport-related behaviours, including those associated with riding a bicycle, riding a motorcycle, and travelling as a passenger (survey of 209 Year 9 students). In focus groups, students (n=30) were able to describe the context of transport risks and injuries. Such information provided evidence of the need for an intervention and ecologically valid data on which to base programme design, including insights into the language, culture and development of adolescents and their experiences with transport risks. Additional information about teaching practices and implementation issues was explored in interviews with 13 teachers. A psychological theory was selected to operationalise the design of the programme, drawing on these preparatory data. The programme, Skills for Preventing Injury in Youth, was evaluated with 197 participating and 137 control students (13–14 year olds). Results showed a significant difference between the intervention and control groups from baseline to 6-month follow-up in a number of transport-related risk behaviours and transport-related injuries. The programme thus demonstrated potential to reduce early adolescents' transport risk behaviours and associated harm. Discussion will involve the implications of the development research process in designing road safety interventions.

Relevance:

30.00%

Publisher:

Abstract:

An analytical solution for steady-state oxygen transport in soils including two sink terms, viz. roots and microbes, with the corresponding vertical distribution scaling lengths forming a ratio p, showed that p governed the critical air-filled porosity, θc, needed by most plants. For low temperature and p, θc was <0.1 m3/m3, but at higher temperatures and p = 1, θc was >0.15 m3/m3. When root length density at the surface was 10^4 m/m3 and p > 3, θc was 0.25 m3/m3, more than half the pore space. Few combinations of soil and climate regularly meet this condition. However, for sandy soils and seasonally warm, arid regions, the theory is consistent with observation, in that plants may have some deep roots. Critical θc values are used to formulate theoretical solutions in a forward mode, so different levels of oxygen uptake by roots may be compared to microbial activity. The proportion of respiration by plant roots increases rapidly with p up to p ≈ 2.

Relevance:

30.00%

Publisher:

Abstract:

Readily accepted knowledge regarding crash causation is consistently omitted from efforts to model and subsequently understand motor vehicle crash occurrence and its contributing factors. For instance, distracted and impaired driving accounts for a significant proportion of crash occurrence, yet is rarely modeled explicitly. In addition, spatially allocated influences such as local law enforcement efforts, proximity to bars and schools, and roadside chronic distractions (advertising, pedestrians, etc.) play a role in contributing to crash occurrence and yet are routinely absent from crash models. By and large, these well-established omitted effects are simply assumed to contribute to model error, with the predominant focus on modeling the engineering and operational effects of transportation facilities (e.g. AADT, number of lanes, speed limits, width of lanes, etc.). The typical analytical approach, with a variety of statistical enhancements, has been to model crashes that occur at system locations as negative binomial (NB) distributed events that arise from a singular, underlying crash generating process. These models and their statistical kin dominate the literature; however, it is argued in this paper that these models fail to capture the underlying complexity of motor vehicle crash causes, and thus thwart deeper insights regarding crash causation and prevention. This paper first describes hypothetical scenarios that collectively illustrate why current models mislead highway safety researchers and engineers. It is argued that current model shortcomings are significant, and will lead to poor decision-making. Exploiting our current state of knowledge of crash causation, crash counts are postulated to arise from three processes: observed network features, unobserved spatial effects, and ‘apparent’ random influences that largely reflect behavioral influences of drivers.
It is argued, furthermore, that these three processes can in theory be modeled separately to gain deeper insight into crash causes, and that such a model represents a more realistic depiction of reality than the state-of-practice NB regression. An admittedly imperfect empirical model that mixes three independent crash occurrence processes is shown to outperform the classical NB model. The questioning of current modeling assumptions and the implications of the latent mixture model for current practice are the most important contributions of this paper, with an initial but rather vulnerable attempt to model the latent mixtures as a secondary contribution.
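The three postulated processes can be illustrated with a toy simulation in which each site's crash count is the sum of three independent Poisson draws (network, spatial, behavioral). The component means, and the mixing used for the spatial and behavioral components, are our own assumptions; the point is only that such a mixture produces the overdispersion (variance exceeding the mean) that motivates NB-type models in the first place.

```python
# Toy mixture of three crash-generating processes at 5000 sites.
import math
import random

def poisson(rng, lam):
    """Knuth's algorithm: sample a Poisson(lam) variate."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(42)
counts = []
for _ in range(5000):
    network = poisson(rng, 2.0)                     # engineering/operational
    spatial = poisson(rng, rng.choice([0.2, 3.0]))  # unobserved spatial effect
    behave = poisson(rng, rng.expovariate(1.0))     # 'apparent' randomness
    counts.append(network + spatial + behave)

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)  # mixing inflates the variance above the mean
```

A single-process Poisson model fit to these counts would misattribute the excess variance to noise, which is precisely the paper's complaint about the standard approach.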

Relevance:

30.00%

Publisher:

Abstract:

Most individuals travel in order to participate in a network of activities which are important for attaining a good standard of living. Because such activities are commonly widely dispersed rather than located locally, regular access to a vehicle is important to avoid exclusion. However, planning transport system provisions that can engage members of society in an acceptable degree of activity participation remains a great challenge. The main challenges in most cities of the world are due to significant population growth and rapid urbanisation, which produce increased demand for transport. Keeping pace with these challenges in most urban areas is difficult due to the widening gap between supply and demand for transport systems, which places the urban population at a transport disadvantage. The key element in mitigating the issue of urban transport disadvantage is to accurately identify the urban transport disadvantaged. Although wide-ranging variables and multi-dimensional methods have been used to identify this group, variables are commonly selected using ad-hoc techniques and unsound methods. This poses questions about whether the variables currently used are accurately linked with urban transport disadvantage, and about the effectiveness of current policies. To fill these gaps, the research conducted for this thesis develops an operational urban transport disadvantage framework (UTDAF) based on key statistical urban transport disadvantage variables to accurately identify the urban transport disadvantaged. The methodology combines qualitative and quantitative statistical approaches, and its reliability and applicability, rather than the accuracy of the estimations, are the prime concern.
Relevant concepts that impact on the identification and measurement of urban transport disadvantage, and a wide range of urban transport disadvantage variables, were identified through a review of the existing literature. Based on the reviews, a conceptual urban transport disadvantage framework was developed based on causal theory. Variables identified during the literature review were selected and consolidated based on the recommendations of international and local experts during a Delphi study. Following the literature review, the conceptual urban transport disadvantage framework was statistically assessed to identify key variables. Using the statistical outputs, the key variables were weighted and aggregated to form the UTDAF. Before the variables' weights were finalised, they were adjusted based on the results of correlation analysis between elements forming the framework, to improve the framework's accuracy. The UTDAF was then applied to three contextual conditions to determine the framework's effectiveness in identifying urban transport disadvantage. The framework is likely to provide policy makers with a robust measure for justifying infrastructure investments and for generating awareness of the issue of urban transport disadvantage.
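The weight-and-aggregate step can be sketched as a simple composite index: indicators are min-max normalised across zones and combined with fixed weights into a single disadvantage score. The indicator names, weights and zone values below are hypothetical; the thesis derives its weights statistically and adjusts them via correlation analysis, which this sketch does not attempt.

```python
# Composite-index sketch: normalise indicators, weight, aggregate, rank.
weights = {"car_ownership_gap": 0.4, "transit_access_deficit": 0.35,
           "low_income_share": 0.25}

zones = {
    "zone_a": {"car_ownership_gap": 0.7, "transit_access_deficit": 0.9,
               "low_income_share": 0.6},
    "zone_b": {"car_ownership_gap": 0.2, "transit_access_deficit": 0.3,
               "low_income_share": 0.1},
    "zone_c": {"car_ownership_gap": 0.5, "transit_access_deficit": 0.4,
               "low_income_share": 0.8},
}

def normalise(zones, key):
    """Min-max normalise one indicator across all zones to [0, 1]."""
    vals = [z[key] for z in zones.values()]
    lo, hi = min(vals), max(vals)
    return {name: (z[key] - lo) / (hi - lo) for name, z in zones.items()}

norm = {key: normalise(zones, key) for key in weights}
scores = {name: sum(weights[k] * norm[k][name] for k in weights)
          for name in zones}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # most to least disadvantaged
```

Zone A scores highest on every indicator and so tops the ranking; in practice the correlation-based weight adjustment described above would guard against correlated indicators double-counting the same underlying disadvantage.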

Relevance:

30.00%

Publisher:

Abstract:

The advances made within the aviation industry over the past several decades have significantly improved the availability, affordability and convenience of air travel and have been greatly beneficial in both social and economic terms. Air transport has developed into an irreplaceable service relied on by millions of people each day, and as such airports have become critical elements of national infrastructure that facilitate the movement of people and goods. As components of critical infrastructure (CI), airports are integral parts of a national economy, supporting regional as well as national trade, commercial activity and employment. Therefore, any disruption or crisis which impacts the continuity of operations at airports can have significant negative consequences for the airport as a business, for the local economy and other nodes of transport infrastructure, as well as for society. Due to the highly dynamic and volatile environment in which airports operate, the aviation industry has faced many different challenges over the years, ranging from terrorist attacks such as September 11, to health crises such as the SARS epidemic, to system breakdowns such as the recent computer system outage at Virgin Blue Airlines in Australia. All these events have highlighted the vulnerability of airport systems to a range of disturbances, as well as the gravity and widespread impact of any kind of discontinuity in airport functions. Such incidents thus emphasise the need for increasing the resilience and reliability of airports and ensuring business continuity in the event of a crisis...

Relevance:

30.00%

Publisher:

Abstract:

This special issue of Networking Science focuses on the Next Generation Network (NGN), which enables the deployment of access-independent services over converged fixed and mobile networks. NGN is a packet-based network and uses the Internet Protocol (IP) to transport the various types of traffic (voice, video, data and signalling). NGN facilitates easy adoption of distributed computing applications by providing high-speed connectivity in a converged networked environment. It also makes end-user devices and applications highly intelligent and efficient by empowering them with programmability and remote configuration options. However, there are a number of important challenges in provisioning next generation network technologies in a converged communication environment. Preliminary challenges include those relating to QoS, switching and routing, management and control, and security, which must be addressed as a matter of urgency. The consideration of architectural issues in the design and provision of secure services for NGN deserves special attention and hence is the main theme of this special issue.

Relevance:

30.00%

Publisher:

Abstract:

This review focuses on one of the fundamental phenomena that occur upon application of sufficiently strong electric fields to gases, namely the formation and propagation of ionization waves (streamers). The dynamics of streamers is controlled by strongly nonlinear coupling, in localized streamer tip regions, between the electric field enhanced by charge separation and the ionization and transport of charged species in the enhanced field. Streamers appear in nature (as initial stages of sparks and lightning, and as huge structures, sprites, above thunderclouds), and are also found in numerous technological applications of electrical discharges. Here we discuss the fundamental physics of guided streamer-like structures, plasma bullets, which are produced in cold atmospheric-pressure plasma jets. Plasma bullets are guided ionization waves moving in a thin column of a jet of plasma-forming gases (e.g., He or Ar) expanding into ambient air. In contrast to streamers in free (unbounded) space, which propagate in a stochastic manner and often branch, guided ionization waves are repetitive and highly reproducible and propagate along the same path, the jet axis. This property of guided streamers, in comparison with streamers in free space, enables many advanced time-resolved experimental studies of ionization waves with nanosecond precision. In particular, experimental studies on the manipulation of streamers by external electric fields and on streamer interactions are critically examined. This review also introduces the basic theories and recent advances in the experimental and computational studies of guided streamers, in particular those related to the propagation dynamics of ionization waves and the various parameters of relevance to plasma streamers. This knowledge is very useful for optimizing the efficacy of applications of plasma streamer discharges in various fields ranging from health care and medicine to materials science and nanotechnology.

Relevance:

30.00%

Publisher:

Abstract:

Product Ecosystem theory is an emerging theory which holds that disruptive “game changing” innovation is only possible when the entire ecosystem is considered. When environmental variables change faster than products or services can adapt, disruptive innovation is required to keep pace. This has many parallels with natural ecosystems, where species that cannot keep up with changes to the environment will struggle or become extinct. In this case the environment is the city, the environmental pressures are pollution and congestion, the product is the car, and the product ecosystem is comprised of roads, bridges, traffic lights, legislation, refuelling facilities, etc. Each one of these components is the responsibility of a different organisation, and so any change that affects the whole ecosystem requires a transdisciplinary approach. As a simple example, cars that communicate wirelessly with traffic lights are only of value if wireless-enabled traffic lights exist, and vice versa. Cars that drive themselves are technically possible, but legislation in most places doesn’t allow their use. According to innovation theory, incremental innovation tends to chase ever diminishing returns and becomes increasingly unable to tackle the “big issues.” Eventually “game changing” disruptive innovation comes along and solves the “big issues” and/or provides new opportunities. Seen through this lens, the environmental pressures of urban traffic congestion and pollution are the “big issues.” It can be argued that the design of cars and the other components of the product ecosystem follows an incremental innovation approach; that is why the “big issues” remain unresolved. This paper explores the problems of pollution and congestion in urban environments from a Product Ecosystem perspective, and from this a strategy is proposed for a transdisciplinary approach to develop and implement solutions.

Relevance:

30.00%

Publisher:

Abstract:

This thesis examines the question of why the automotive mode, and the large technological system it creates, continue to dominate urban transport systems despite the availability of more cost-efficient alternatives. A number of theoretical insights are developed into the way the resulting losses evolve from path-dependent growth and lead to market failure and lock-in. The important role of asymmetries of influence is highlighted. A survey of commuters in Jakarta, Indonesia, is used to provide a measure of transport modal lock-in (TML) in a developing country conurbation. A discrete choice experiment is used to provide evidence for the thesis's central hypothesis that in such conurbations there is a high level of commuter awareness of the negative externalities generated by TML, which can produce a strong level of support for its reversal. Why TML nevertheless remains a strong and durable feature of the transport system is examined with reference to the role of asymmetries of influence.
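Discrete choice experiments of the kind described are commonly analysed with a multinomial logit model, in which the probability of choosing a travel mode is the softmax of its systematic utility. The modes, attributes and coefficients below are hypothetical, intended only to show the mechanics; the thesis's actual model specification is not reproduced here.

```python
# Multinomial logit sketch for a three-mode commuting choice.
import math

# Assumed utility coefficients for travel cost and in-vehicle time.
beta_cost, beta_time = -0.08, -0.05

modes = {
    "car":  {"cost": 25.0, "time": 40.0, "asc": 0.5},  # asc: mode constant
    "bus":  {"cost": 8.0,  "time": 70.0, "asc": 0.0},
    "rail": {"cost": 12.0, "time": 45.0, "asc": 0.2},
}

def choice_probabilities(modes):
    """Softmax of systematic utilities V = asc + b_cost*cost + b_time*time."""
    utils = {m: a["asc"] + beta_cost * a["cost"] + beta_time * a["time"]
             for m, a in modes.items()}
    denom = sum(math.exp(u) for u in utils.values())
    return {m: math.exp(u) / denom for m, u in utils.items()}

probs = choice_probabilities(modes)
print(probs)
```

In a lock-in study, the alternative-specific constants absorb habit and infrastructure effects that persist even when cost and time favour another mode, which is one way the asymmetries discussed above can be made measurable.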