900 results for Simultaneous multithreading


Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because no readily available code included electron binding-energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code comprises the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written.

The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions, and in comparisons of model results with direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of including electron binding-energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application; the most significant effect is a reduction of low-angle scatter flux for high-atomic-number scatterers.

To apply the Monte Carlo code effectively to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This framework forms the basis for analytical models of bone mineral measurement by dual-energy X-ray photon absorptiometry techniques.

Monte Carlo models of dual-energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal; for the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual-energy X-ray transmission measurements plus a linear measurement of the distance along the ray path, designated here as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, indicating the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:

1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.

The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
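As context for the dual-energy decomposition described above, the standard two-component model (the notation here is ours, not the thesis's) treats the log-attenuation measured at two beam energies as a pair of linear equations in the areal densities of bone mineral and soft tissue, solvable in closed form:

\begin{aligned}
L_{\mathrm{low}}  &= \mu_{b,\mathrm{low}}  M_b + \mu_{s,\mathrm{low}}  M_s,\\
L_{\mathrm{high}} &= \mu_{b,\mathrm{high}} M_b + \mu_{s,\mathrm{high}} M_s,
\end{aligned}
\qquad
M_b = \frac{L_{\mathrm{low}}\,\mu_{s,\mathrm{high}} - L_{\mathrm{high}}\,\mu_{s,\mathrm{low}}}
           {\mu_{b,\mathrm{low}}\,\mu_{s,\mathrm{high}} - \mu_{b,\mathrm{high}}\,\mu_{s,\mathrm{low}}},

where L_E = ln(I_0/I) is measured at energy E, M_b and M_s are areal densities (g/cm²), and the μ terms are mass attenuation coefficients. Two measurements resolve only two unknowns, which is the two-dimensional limitation the thesis refers to; the DPA(+) path-length measurement supplies a third equation (total thickness as the sum of the component thicknesses), allowing bone mineral, fat and lean soft tissue to be separated.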

Abstract:

The human-technology nexus is a strong focus of Information Systems (IS) research; however, very few studies have explored this phenomenon in anaesthesia. Anaesthesia has a long history of adopting technological artifacts, ranging from early apparatus to present-day information systems such as electronic monitoring and pulse oximetry. This prevalence of technology in modern anaesthesia and the rich human-technology relationship provide a fertile empirical setting for IS research.

This study employed a grounded theory approach that began with a broad initial guiding question and, through simultaneous data collection and analysis, uncovered a core category of technology appropriation. This emergent basic social process captures a central activity of anaesthetists and is supported by three major concepts: knowledge-directed medicine, complementary artifacts and the culture of anaesthesia. The outcomes of this study are: (1) a substantive theory that integrates the aforementioned concepts and pertains to the research setting of anaesthesia; and (2) a formal theory, which develops the core category of appropriation from an anaesthesia-specific to a broader, more general perspective. These outcomes fulfill the objective of a grounded theory study: the formation of theory that describes and explains observed patterns in the empirical field.

In generalizing the notion of appropriation, the formal theory is developed using the theories of Karl Marx. This Marxian model of technology appropriation is a three-tiered theoretical lens that examines appropriation behaviours at a highly abstract level, connecting the stages of natural, species and social being to the transition of a technology-as-artifact to a technology-in-use via the processes of perception, orientation and realization. The contributions of this research are two-fold: (1) the substantive model contributes to practice by describing and explaining the human-technology nexus in anaesthesia, thereby offering predictive potential for designers and administrators seeking to optimize future appropriations of new anaesthetic technological artifacts; and (2) the formal model contributes to research by drawing attention to the philosophical foundations of appropriation in the work of Marx, and by expanding the current understanding of contemporary IS theories of adoption and appropriation.

Abstract:

Advances in safety research, which aims to improve the collective understanding of motor vehicle crash causation, rest upon the pursuit of numerous lines of inquiry. The research community has focused on analytical methods development (negative binomial specifications, simultaneous equations, etc.), on better experimental designs (before-after studies, comparison sites, etc.), on improving exposure measures, and on model specification improvements (additive terms, non-linear relations, etc.). One might think of different lines of inquiry in terms of 'low-lying fruit': areas of inquiry that might provide significant improvements in understanding crash causation. It is the contention of this research that omitted-variable bias, caused by the exclusion of important variables, is an important line of inquiry in safety research. In particular, spatially related variables are often difficult to collect and are omitted from crash models, yet they offer significant ability to better understand the contributing factors to crashes. This study, believed to represent a unique contribution to the safety literature, develops and examines the role of a sizeable set of spatial variables in intersection crash occurrence. In addition to commonly considered traffic and geometric variables, the examined spatial factors include local influences of weather, sun glare, proximity to drinking establishments, and proximity to schools. The results indicate that inclusion of these factors yields a significant improvement in model explanatory power, and the results generally agree with expectations. The research illuminates the importance of spatial variables in safety research and the negative consequences of their omission.
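For reference, the negative binomial specification mentioned above, which is standard in crash-frequency modelling (the notation here is ours), models the expected crash count at site i as a log-linear function of covariates while allowing the overdispersion typical of crash data:

P(y_i \mid x_i) = \frac{\Gamma(y_i + 1/\alpha)}{\Gamma(1/\alpha)\, y_i!}
\left(\frac{1/\alpha}{1/\alpha + \mu_i}\right)^{1/\alpha}
\left(\frac{\mu_i}{1/\alpha + \mu_i}\right)^{y_i},
\qquad \mu_i = \exp(\beta' x_i), \quad \mathrm{Var}(y_i) = \mu_i + \alpha\mu_i^2.

Omitted-variable bias enters through β: if a relevant spatial covariate (say, sun glare exposure) is correlated with an included variable (say, traffic volume or approach orientation), excluding it biases the included variable's estimated coefficient.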

Abstract:

A substantial body of research focuses on understanding the effects of socio-demographics, land-use characteristics, and mode-specific attributes on travel mode choice and time-use patterns. Residential and commercial densities, inter-mixing of land uses, and route directness interact with transportation performance characteristics to influence accessibility to destinations, as well as time spent traveling and engaging in activities. This study uniquely examines the durations of out-of-home subsistence, maintenance, and discretionary activities, as well as total tour durations (summing all activity categories within a tour). Cross-sectional activity data are obtained from household activity-travel survey data from the Atlanta Metropolitan Region, and time allocations on weekdays and weekends are compared. The censoring of durations and the endogeneity between activity categories and within individuals are captured using multiple-equation Tobit models. The analysis and modeling reveal that land-use characteristics such as net residential density and the number of commercial parcels within a kilometer of a residence are associated with differences in weekday and weekend time-use allocations. Household type and structure are significant predictors across the three activity categories, but not of overall travel times. Tour characteristics such as time of day and the primary travel mode of the tour also affect travelers' out-of-home activity-tour time-use patterns.
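The Tobit structure referred to here handles the censoring that arises because many respondents record zero minutes for a given activity category. In its generic single-equation form (the study's multiple-equation version additionally allows correlated errors across activity categories; notation ours), with latent duration y* and observed duration y:

y_i^* = \beta' x_i + \varepsilon_i, \quad \varepsilon_i \sim N(0, \sigma^2), \qquad y_i = \max(0, y_i^*),

\ln L = \sum_{y_i = 0} \ln \Phi\!\left(\frac{-\beta' x_i}{\sigma}\right)
      + \sum_{y_i > 0} \ln\!\left[\frac{1}{\sigma}\,\phi\!\left(\frac{y_i - \beta' x_i}{\sigma}\right)\right].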

Abstract:

In order to examine time allocation patterns within household-level trip-chaining, simultaneous doubly-censored Tobit models are applied to model time-use behavior within the context of household activity participation. Using the full sample and a sub-sample of worker households from Tucson's Household Travel Survey, two sets of models are developed to better understand trip-chaining behavior among five types of households: single non-worker households, single-worker households, couple non-worker households, couple one-worker households, and couple two-worker households. Durations of out-of-home subsistence, maintenance, and discretionary activities within trip chains are examined. Factors found to be associated with trip-chaining behavior include intra-household interactions, household type and structure, and the attributes of household heads.
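The doubly-censored variant used here extends the Tobit form shown earlier by censoring durations at both ends; writing the upper bound as T (for example, the total time available within the chain; the notation is again illustrative):

y_i = \begin{cases} 0 & y_i^* \le 0 \\ y_i^* & 0 < y_i^* < T \\ T & y_i^* \ge T \end{cases}
\qquad
\ln L = \sum_{y_i=0}\ln\Phi\!\left(\frac{-\beta'x_i}{\sigma}\right)
      + \sum_{0<y_i<T}\ln\!\left[\frac{1}{\sigma}\phi\!\left(\frac{y_i-\beta'x_i}{\sigma}\right)\right]
      + \sum_{y_i=T}\ln\!\left[1-\Phi\!\left(\frac{T-\beta'x_i}{\sigma}\right)\right].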

Abstract:

The influence of biogenic particle formation on climate is a well-recognised phenomenon. To understand the mechanisms underlying biogenic particle formation, it is of utmost importance to determine the chemical composition of the new particles and therefore the species that drive particle production. Because of the very small amount of mass involved, indirect approaches are frequently used to infer the composition. We present here the results of such an indirect approach, based on simultaneous measurement of the volatile and hygroscopic properties of newly formed particles in a forest environment. It is shown that the particles are composed of both sulphates and organics, with the amount of the sulphate component depending strongly on the available gas-phase sulphuric acid, and the organic components having the same volatility and hygroscopicity as the photooxidation products of a monoterpene such as α-pinene. Our findings are consistent with a two-step process: nucleation and cluster formation, followed by simultaneous growth through condensation of sulphates and organics, which takes the particles to climatically relevant sizes.
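A common way to turn such hygroscopicity measurements into composition estimates (the abstract does not specify the retrieval used, so this is the generic approach) is the Zdanovskii-Stokes-Robinson (ZSR) mixing rule, which treats the hygroscopic growth factor of a mixed particle as a volume-weighted combination of the growth factors of its pure components:

GF_{\mathrm{mix}}(RH) = \left( \sum_i \varepsilon_i \, GF_i(RH)^3 \right)^{1/3},

where ε_i is the volume fraction of component i. With GF_mix measured, and GF_i known for sulphate and for monoterpene oxidation products, the volume fractions can be inverted from the data.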

Abstract:

The idea of informed learning, applicable in academic, workplace and community settings, has been derived largely from a program of phenomenographic research in the field of information literacy, which has illuminated the experience of using information to learn. Informed learning is about simultaneous attention to information use and learning, where both information and learning are considered to be relational; it is built upon a series of key concepts such as second-order perspective, simultaneity, awareness, and relationality. Informed learning also relies heavily on reflection as a strategy for bringing about learning. As a pedagogical construct, informed learning supports inclusive curriculum design and implementation. This paper reports aspects of the informed learning research agenda currently being pursued at the Queensland University of Technology (QUT). The first part elaborates the idea of informed learning, examines the key concepts underpinning this pedagogical construct, and explains its emergence from the research base of the QUT Information Studies research team. The second part presents a case that demonstrates the ongoing development of informed learning theory and practice, through the development of inclusive informed learning for a culturally diverse higher education context.

Abstract:

Streaming SIMD Extensions (SSE) is a special feature available in the Intel Pentium III and Pentium 4 classes of microprocessors. As its name implies, SSE enables the execution of SIMD (Single Instruction, Multiple Data) operations upon 32-bit floating-point data; the performance of floating-point algorithms can therefore be improved. In electrified railway system simulation, the computation involves solving a huge set of simultaneous linear equations, which represent the electrical characteristics of the railway network at a particular time step; a fast solution of these equations is desirable in order to simulate the system in real time. In this paper, we present how SSE is applied to the railway network simulation.
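To make the mechanism concrete, here is a minimal sketch (ours, not the paper's code) of the kind of inner loop that dominates the direct solution of simultaneous linear equations: the Gaussian-elimination row update dst = dst - factor * src, processed four single-precision values at a time with SSE intrinsics.

#include <xmmintrin.h>  /* SSE intrinsics (Pentium III and later) */

/* Subtract factor * src from dst over n floats, four at a time.
   Illustrative only: uses unaligned loads and a scalar tail loop. */
static void row_update(float *dst, const float *src, float factor, int n)
{
    __m128 f = _mm_set1_ps(factor);          /* broadcast factor to 4 lanes */
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 s = _mm_loadu_ps(src + i);    /* load 4 floats from src */
        __m128 d = _mm_loadu_ps(dst + i);    /* load 4 floats from dst */
        d = _mm_sub_ps(d, _mm_mul_ps(f, s)); /* d -= f * s in all 4 lanes */
        _mm_storeu_ps(dst + i, d);           /* write the 4 results back */
    }
    for (; i < n; ++i)                       /* scalar tail for n % 4 */
        dst[i] -= factor * src[i];
}

int main(void)
{
    float pivot_row[6] = {1, 2, 3, 4, 5, 6};
    float row[6]       = {2, 4, 6, 8, 10, 12};
    row_update(row, pivot_row, 2.0f, 6);     /* row -= 2 * pivot_row */
    return row[0] == 0.0f ? 0 : 1;           /* every element becomes 0 */
}

In an LU or Gaussian-elimination solution of the network's nodal equations, this update runs once per pivot for every remaining row, so a four-wide inner loop of this kind is where an SSE speedup of the simulation would come from.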

Abstract:

Early childhood education and care (ECEC) in Australia is currently a focus of social and economic policy. However, early childhood leadership in Australia is yet to develop a clear identity that will enable the field to reach its full potential. In this paper we investigate a distinctive theoretical framework for constructing leadership identity, based on transformational leadership and epistemological beliefs. Using semi-structured interviews, 15 childcare directors from a large metropolitan area in Australia were asked to describe their beliefs about knowing in the context of their leadership practices. The findings showed that leaders who espoused predominantly evaluativist beliefs about knowing (n = 5) were more likely to describe transformational leadership behaviours in the context of childcare leadership. A number of leaders held mixed beliefs about knowing (n = 9) and described their leadership practice in ways that reflected both transactional and transformational leadership styles. Finally, one leader described predominantly objectivist epistemological beliefs and transactional beliefs about leadership. These preliminary findings suggest a relationship between core epistemological beliefs and beliefs about leadership practices, and offer a new way to characterise leadership in ECEC in Australia.

Abstract:

This paper discusses the content, origin and development of Tendering Theory as a theory of price determination. It demonstrates how tendering theory determines market prices and how it differs from game and decision theories. It also shows that, in a tendering process with non-cooperative, simultaneous, single sealed bids, individual private valuations, extensive public information, a large number of bidders and a long sequence of tendering occasions, a competitive equilibrium develops. The development of a competitive equilibrium means that the concept of the tender as the sum of a valuation and a strategy, which is at the core of tendering theory, cannot be supported, and that there are serious empirical, theoretical and methodological inconsistencies in the theory.

Abstract:

The aluminate hydrotalcites are proposed to have either of the following formulas: Mg₄Al₂(OH)₁₂(CO₃²⁻)·xH₂O or Mg₄Al₂(OH)₁₂(CO₃²⁻, SO₄²⁻)·xH₂O. A pure hydrotalcite phase forms when magnesium chloride and aluminate solutions are mixed at a 1:1 volumetric ratio at pH 14. Synthesis of the aluminate hydrotalcites using seawater results in the formation of an impurity phase, bayerite. Two decomposition steps have been identified for the aluminate hydrotalcites: (1) removal of interlayer water (230 °C) and (2) simultaneous dehydroxylation and decarbonation (330 °C).

Abstract:

This paper describes a novel probabilistic approach to incorporating odometric information into appearance-based SLAM systems, without performing metric map construction or calculating relative feature geometry. The proposed system, dubbed Continuous Appearance-based Trajectory SLAM (CAT-SLAM), represents location as a probability distribution along a trajectory, and represents appearance continuously over the trajectory rather than at discrete locations. The distribution is evaluated using a Rao-Blackwellised particle filter, which weights particles based on local appearance and odometric similarity and explicitly models both the likelihood of revisiting previous locations and visiting new locations. A modified resampling scheme counters particle deprivation and allows loop closure updates to be performed in constant time regardless of map size. We compare the performance of CAT-SLAM to FAB-MAP (an appearance-only SLAM algorithm) in an outdoor environment, demonstrating a threefold increase in the number of correct loop closures detected by CAT-SLAM.
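The weighting scheme described above can be read as the usual Rao-Blackwellised particle-filter importance update; in generic form (the paper gives CAT-SLAM's exact factorisation, so this notation is ours), each particle's weight is scaled by how well its hypothesised trajectory position explains the current appearance and odometry observations:

w_t^{(i)} \propto w_{t-1}^{(i)} \; p\big(z_t^{\mathrm{app}} \mid x_t^{(i)}\big)\, p\big(z_t^{\mathrm{odom}} \mid x_t^{(i)}\big),

where x_t^{(i)} is particle i's position along the learned trajectory; resampling (here, the modified scheme that counters particle deprivation) then concentrates particles on well-supported locations.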

Abstract:

Scalable high-resolution tiled display walls are becoming increasingly important to decision makers and researchers because high pixel counts in combination with large screen areas facilitate content-rich, simultaneous display of computer-generated visualisation information and high-definition video data from multiple sources. This tutorial is designed to cater for new users as well as researchers who are currently operating tiled display walls or 'OptiPortals'. We will discuss the current and future applications of display wall technology and explore opportunities for participants to collaborate and contribute to a growing community. Multiple tutorial streams will cover hands-on practical development, as well as policy and method design for embedding these technologies in the research process. Attendees will gain an understanding of how to get started with developing similar systems themselves, in addition to becoming familiar with typical applications and large-scale visualisation techniques. Presentations in this tutorial will describe current implementations of tiled display walls that highlight the effective use of screen real estate with various visualisation datasets, including collaborative applications such as visualcasting, classroom learning and video conferencing. A feature presentation for this tutorial will be given by Jurgen Schulze from Calit2 at the University of California, San Diego. Jurgen is an expert in scientific visualisation in virtual environments, human-computer interaction, real-time volume rendering, and graphics algorithms on programmable graphics hardware.

Abstract:

A trend in the design and implementation of modern industrial automation systems is to integrate computing, communication and control into a unified framework at different levels of machine/factory operation and information processing. These distributed control systems are referred to as networked control systems (NCSs). They are composed of sensors, actuators and controllers interconnected over communication networks. As most communication networks are not designed for NCS applications, the communication requirements of NCSs may not be satisfied. For example, traditional control systems require data to be accurate, timely and lossless; because of random transmission delays and packet losses, the control performance of a system may deteriorate badly, and the system may even be rendered unstable. The main challenge of NCS design is to maintain and improve the stable control performance of an NCS, which requires communication and control methodologies to be designed together.

In recent decades, Ethernet and 802.11 networks have been introduced into control networks and have even replaced traditional fieldbus products in some real-time control applications, because of their high bandwidth and good interoperability. As Ethernet and 802.11 networks are not designed for distributed control applications, two aspects of NCS research need to be addressed to make these communication networks suitable for control systems in industrial environments. From the perspective of networking, communication protocols need to be designed to satisfy NCS communication requirements such as real-time communication and high-precision clock consistency. From the perspective of control, methods to compensate for network-induced delays and packet losses are important for NCS design.

To make Ethernet and 802.11 networks suitable for distributed control applications, this thesis develops a high-precision relative clock synchronisation protocol and an analytical model for analysing the real-time performance of 802.11 networks, and designs a new predictive compensation method. Firstly, a hybrid NCS simulation environment based on the NS-2 simulator is designed and implemented. Secondly, a high-precision relative clock synchronisation protocol is designed and implemented. Thirdly, transmission delays in 802.11 networks for soft real-time control applications are modelled using a Markov chain in which real-time quality-of-service parameters are analysed under a periodic traffic pattern; this model accurately captures the trade-off between real-time performance and throughput. Furthermore, a cross-layer optimisation scheme, featuring application-layer flow-rate adaptation, is designed to achieve a trade-off between real-time and throughput performance in a typical NCS scenario with a wireless local area network. Fourthly, as a co-design approach for both the network and the controller, a new predictive compensation method for variable delay and packet loss in NCSs is designed, in which simultaneous end-to-end delays and packet losses during packet transmission from sensors to actuators are tackled. The effectiveness of the proposed predictive compensation approach is demonstrated using our hybrid NCS simulation environment.
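As a deliberately simplified illustration of the kind of predictive compensation described above (this scalar sketch is ours, not the thesis's algorithm): if the plant is modelled as a discrete-time linear system x[k+1] = a·x[k] + b·u[k] and the controller knows the current measurement is d samples old, it can roll the model forward through the inputs it has already sent and compute the control from the predicted state rather than the stale measurement.

#include <stdio.h>

/* Scalar plant model x[k+1] = A*x[k] + B*u[k] (illustrative values). */
#define A 0.9
#define B 0.5

/* Predict the current state from a d-sample-old measurement by
   replaying the inputs sent while the measurement was in flight. */
double predict_state(double x_delayed, const double *u_history, int d)
{
    double x = x_delayed;
    for (int k = 0; k < d; ++k)
        x = A * x + B * u_history[k];       /* roll the model forward */
    return x;
}

int main(void)
{
    double u_sent[3] = {1.0, 0.8, 0.6};     /* inputs sent during the delay */
    double x_meas = 2.0;                    /* measurement, 3 samples old */
    double x_now = predict_state(x_meas, u_sent, 3);
    double u = -0.4 * x_now;                /* simple proportional law on
                                               the predicted state */
    printf("predicted state %.3f, control %.3f\n", x_now, u);
    return 0;
}

A real NCS compensator must also handle lost packets (for example, by predicting across missing inputs) and a time-varying delay d, which is where the thesis's clock synchronisation and delay modelling come in.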

Abstract:

A series of kaolinite-potassium acetate intercalation composites was prepared. The thermal behavior and decomposition of these composites were investigated by simultaneous differential scanning calorimetry-thermogravimetric analysis (DSC-TGA), X-ray diffraction (XRD) and Fourier-transform infrared (FT-IR) spectroscopy. The XRD pattern at room temperature indicated that intercalation of potassium acetate into kaolinite causes an increase of the basal spacing from 0.718 to 1.428 nm. The peak intensity of the expanded phase of the composite decreased on heating above 300 °C, and the basal spacing was reduced to 1.19 nm at 350 °C and 0.718 nm at 400 °C. These observations were supported by the DSC-TGA and FT-IR measurements, in which endothermic reactions are observed between 300 and 600 °C. These reactions can be divided into two stages: (1) removal of the intercalated molecules between 300 and 400 °C, and (2) dehydroxylation of kaolinite between 400 and 600 °C. Significant changes were observed in the infrared bands assigned to the outer-surface hydroxyls, inner-surface hydroxyls, inner hydroxyls and hydrogen bonds.
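For readers relating the reported basal spacings to the diffraction measurements: the spacings follow directly from Bragg's law. Assuming the common Cu Kα source (λ = 0.15406 nm; the abstract does not state the radiation used), the expanded 1.428 nm phase corresponds to a first-order reflection near 2θ ≈ 6.2°, against roughly 12.3° for the collapsed 0.718 nm kaolinite spacing:

n\lambda = 2d\sin\theta \;\Rightarrow\; 2\theta = 2\arcsin\!\left(\frac{\lambda}{2d}\right)
= 2\arcsin\!\left(\frac{0.15406}{2 \times 1.428}\right) \approx 6.2^{\circ}.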