356 results for Random Number Generation


Relevance:

20.00%

Publisher:

Abstract:

As part of a large study investigating indoor air in residential houses in Brisbane, Australia, the purpose of this work was to quantify indoor exposure to submicrometer particles and PM2.5 for the inhabitants of 14 houses. Particle concentrations were measured simultaneously for more than 48 hours in the kitchens of all the houses using a condensation particle counter (CPC) and a photometer (DustTrak). The occupants of the houses were asked to fill in a diary, noting the time and duration of any activity occurring throughout the house during measurement, as well as their presence at or absence from home. From the time series concentration data and the information about indoor activities, exposure for the inhabitants of the houses was calculated for the entire time they spent at home, as well as during indoor activities resulting in particle generation. The results show that the highest median concentration levels occurred during cooking periods for both particle number concentration (47.5 × 10³ particles cm⁻³) and PM2.5 concentration (13.4 µg m⁻³). The highest residential exposure period was the sleeping period for both particle number exposure (31%) and PM2.5 exposure (45.6%). Average residential particle exposure accounted for approximately 70% of total 24-hour exposure for both particle number and PM2.5.
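
To make the exposure calculation concrete, here is a minimal Python sketch of the kind of time-activity-weighted computation the abstract describes: integrating a logged concentration series over diary-reported activity intervals. This is not the authors' code; the file name, column name and timestamps are hypothetical.

```python
import pandas as pd

def exposure_during(conc: pd.Series, intervals) -> float:
    """Time-integrated exposure (concentration x hours) over diary intervals.

    conc      : concentration time series indexed by timestamp
                (e.g. CPC counts in particles/cm3, or DustTrak PM2.5 in ug/m3)
    intervals : iterable of (start, end) pd.Timestamp pairs from the diary
    """
    total = 0.0
    for start, end in intervals:
        window = conc.loc[start:end]
        hours = (end - start).total_seconds() / 3600.0
        # approximate the integral as mean concentration x duration
        total += float(window.mean()) * hours
    return total

# Hypothetical usage with a 1-minute CPC log and one cooking period:
# conc = pd.read_csv("cpc_house01.csv", index_col=0, parse_dates=True)["count"]
# cooking = [(pd.Timestamp("2001-05-03 18:05"), pd.Timestamp("2001-05-03 18:50"))]
# print(exposure_during(conc, cooking))
```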

Relevance:

20.00%

Publisher:

Abstract:

Knowledge of the particle emission characteristics associated with forest fires and, in general, biomass burning is becoming increasingly important due to the impact of these emissions on human health. Of particular importance is developing a better understanding of the size distribution of particles generated from forest combustion under different environmental conditions, as well as the provision of emission factors for different particle size ranges. This study aimed to quantify particle emission factors from four types of wood found in South East Queensland forests: Spotted Gum (Corymbia citriodora), Red Gum (Eucalyptus tereticornis), Blood Gum (Eucalyptus intermedia) and Ironbark (Eucalyptus decorticans), under controlled laboratory conditions. The experimental set-up comprised a modified commercial stove connected to a dilution system designed for the conditions of the study. Measurements of the particle number size distribution and concentration resulting from the burning of wood with a relatively homogeneous moisture content (in the range of 15 to 26%) and at different burning rates were performed using a TSI Scanning Mobility Particle Sizer (SMPS) over the size range 10 to 600 nm, and a TSI DustTrak for PM2.5. The results, in terms of the relationship between particle number size distribution and burning conditions for the different species, show that particle number emission factors and PM2.5 mass emission factors depend on the type of wood and on the burning rate (fast or slow burning). The average particle number emission factors for fast burning conditions are in the range of 3.3 × 10¹⁵ to 5.7 × 10¹⁵ particles/kg, and for PM2.5 in the range of 139 to 217 mg/kg.
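
As a rough illustration of how a number emission factor of this kind can be derived from bench measurements, here is a hedged Python sketch. It is not the study's protocol; the flue flow, dilution ratio and example figures are invented solely to show the arithmetic.

```python
def number_emission_factor(mean_conc_cm3: float, flue_flow_m3_s: float,
                           dilution_ratio: float, burn_time_s: float,
                           wood_burned_kg: float) -> float:
    """Particles emitted per kg of wood burned.

    mean_conc_cm3  : mean SMPS number concentration in the diluted sample (particles/cm3)
    flue_flow_m3_s : volumetric flow through the flue (m3/s)
    dilution_ratio : sample dilution applied upstream of the SMPS
    """
    conc_m3 = mean_conc_cm3 * 1e6 * dilution_ratio   # back to undiluted particles/m3
    particles = conc_m3 * flue_flow_m3_s * burn_time_s
    return particles / wood_burned_kg

# Invented figures, shown only so the units work out:
# number_emission_factor(5e5, 0.02, 100, 1800, 0.5) -> ~3.6e15 particles/kg,
# the same order of magnitude as the fast-burning factors reported above.
```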

Relevance:

20.00%

Publisher:

Abstract:

I am suspicious of tools without a purpose - tools that are not developed in response to a clearly defined problem. Of course, tools without a purpose can still be useful. However, the development of first generation CAD was seriously impeded because the solution came before the problem. We are in danger of repeating this mistake if we do not clarify the nature of the problem that we are trying to solve with the next generation of tools. Back in the 1980s I used to add a postscript slide at the end of CAD conference presentations, and the applause would invariably turn to concern. The slide simply asked: can anyone remember what it was about design that needed aiding before we had computer aided design?

Relevance:

20.00%

Publisher:

Abstract:

The majority of the world’s citizens now live in cities. Although urban planning can thus be thought of as a field with significant ramifications for the human condition, many practitioners feel that it has reached a crossroads in thought leadership, between traditional practice and a new, more participatory and open approach. Conventional ways to engage people in participatory planning exercises are limited in reach and scope. At the same time, socio-cultural trends and technology innovation offer opportunities to rethink the status quo in urban planning. Neogeography introduces tools and services that allow non-geographers to use advanced geographical information systems. Similarly, is there potential for the emergence of a neo-planning paradigm in which urban planning is carried out through active civic engagement, aided by Web 2.0 and new media technologies, thus redefining the role of practising planners? This paper traces a number of evolving links between urban planning, neogeography, and information and communication technology. Two significant trends with direct implications for urban planning (participation and visualisation) are discussed. Combining advanced participation and visualisation features, the popular virtual reality environment Second Life is then introduced as a test bed to explore a planning workshop and an integrated software event framework to assist narrative generation. We discuss an approach to harnessing and analysing narratives that uses virtual reality logging to make transparent how users understand and interpret proposed urban designs.

Relevance:

20.00%

Publisher:

Abstract:

Online technological advances are pioneering the wider distribution of geospatial information for general mapping purposes. The use of popular web-based applications such as Google Maps is ensuring that mapping-based applications become commonplace amongst Internet users, which has facilitated the rapid growth of geo-mashups. These user-generated creations enable Internet users to aggregate and publish information over specific geographical points. This article identifies privacy-invasive geo-mashups that involve the unauthorized use of personal information, the inadvertent disclosure of personal information, and invasion of privacy issues. Building on Zittrain’s Privacy 2.0, the author contends that first generation information privacy laws, founded on the notions of fair information practices or information privacy principles, may have a limited impact on the resolution of privacy problems arising from privacy-invasive geo-mashups, principally because geo-mashups have different patterns of personal information provision, collection, storage and use that reflect fundamental changes in the Web 2.0 environment. The author concludes by recommending embedded technical and social solutions to minimize the risks arising from privacy-invasive geo-mashups, which could lead to the establishment of guidelines for the general protection of privacy in geo-mashups.

Relevance:

20.00%

Publisher:

Abstract:

Police services in a number of Australian states and overseas jurisdictions have begun to implement or consider random roadside drug testing of drivers. This paper outlines research conducted to estimate the extent of drug driving in a sample of Queensland drivers in regional, rural and metropolitan areas. Oral fluid samples were collected from 2657 Queensland motorists and screened for illicit substances including cannabis (delta-9-tetrahydrocannabinol [THC]), amphetamines, ecstasy and cocaine. Overall, 3.8% of the sample (n = 101) screened positive for at least one illicit substance, and multiple drugs were identified in 23 respondents. The drugs most commonly detected in oral fluid were ecstasy (n = 53) and cannabis (n = 46), followed by amphetamines (n = 23). A key finding was that cannabis was confirmed as the most common self-reported drug combined with driving, and that individuals who tested positive for any drug through oral fluid analysis were also more likely to report the highest frequency of drug driving. Furthermore, a comparison of drug and drink driving detection rates for one region of the study revealed a higher detection rate for drug driving (3.8%) than for drink driving (0.8%). This research provides evidence that drug driving is relatively prevalent on Queensland roads, and may in fact be more common than drink driving. The paper further outlines the study findings and presents possible directions for future drug driving research.
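
For readers who want to reproduce the headline figure, a small Python sketch of the prevalence arithmetic follows. The Wald confidence interval is an illustrative addition, not a statistic reported in the abstract.

```python
from math import sqrt

def prevalence_ci(positives: int, n: int, z: float = 1.96):
    """Point estimate and Wald 95% CI for a screening prevalence."""
    p = positives / n
    se = sqrt(p * (1 - p) / n)
    return p, (p - z * se, p + z * se)

# Figures from the abstract: 101 positive screens among 2657 motorists
p, (lo, hi) = prevalence_ci(101, 2657)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # ~3.8% (3.1%-4.5%)
```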

Relevance:

20.00%

Publisher:

Abstract:

The measurement of submicrometre (< 1.0 µm) and ultrafine (diameter < 0.1 µm) particle number concentrations has attracted attention over the last decade because the potential health impacts associated with exposure to these particles can be more significant than those due to exposure to larger particles. At present, ultrafine particles are not regularly monitored and are yet to be incorporated into air quality monitoring programs. As a result, very few studies have analysed long-term and spatial variations in ultrafine particle concentration, and none in Australia. To address this gap in scientific knowledge, the aim of this research was to investigate the long-term trends and seasonal variations in particle number concentrations in Brisbane, Australia. Data collected over a five-year period were analysed using weighted regression models. Monthly mean concentrations in the morning (6:00-10:00) and the afternoon (16:00-19:00) were plotted against time in months, using the monthly variance as the weights. During the five-year period, submicrometre and ultrafine particle concentrations increased in the morning by 105.7% and 81.5% respectively, whereas in the afternoon there was no significant trend. The morning concentrations were associated with fresh traffic emissions and the afternoon concentrations with the background. The statistical tests applied to the seasonal models, on the other hand, indicated that there was no seasonal component.

The spatial variation in size distribution in a large urban area was investigated using particle number size distribution data collected at nine different locations during different campaigns. The size distributions were represented by modal structures and cumulative size distributions. Particle number peaked at around 30 nm, except at an isolated site dominated by diesel trucks, where it peaked at around 60 nm. Ultrafine particles contributed 82%-90% of the total particle number. At the sites dominated by petrol vehicles, nanoparticles (< 50 nm) contributed 60%-70% of the total particle number; at the site dominated by diesel trucks they contributed 50%. Although the sampling campaigns took place during different seasons and were of varying duration, these variations did not affect the particle size distributions. The results suggested that the distributions were instead shaped by differences in traffic composition and distance to the road.

To investigate the occurrence of nucleation events, that is, secondary particle formation from gaseous precursors, particle size distribution data collected over a 13-month period during five different campaigns were analysed. The study area was a complex urban environment influenced by anthropogenic and natural sources. The study introduced a new application of time series differencing for the identification of nucleation events. To evaluate the conditions favourable to nucleation, the meteorological conditions and gaseous concentrations prior to and during nucleation events were recorded. Gaseous concentrations did not exhibit a clear pattern of change, while nucleation was found to be associated with sea breezes and long-range transport. The implication of this finding is that whilst vehicles are the most important source of ultrafine particles, sea breezes and aged gaseous emissions play a more important role in secondary particle formation in the study area.
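
A minimal Python sketch of the two analysis steps described above (the variance-weighted trend fit and event flagging by differencing) is given below. The data are simulated stand-ins, statsmodels is assumed as the fitting library, and the 3-sigma event threshold is an illustrative choice rather than the thesis criterion.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Illustrative stand-ins for 60 monthly morning means and their variances
months = np.arange(60)
means = 8e3 + 140.0 * months + rng.normal(0, 400, 60)   # particles/cm3
variances = rng.uniform(1e5, 5e5, 60)

# Weighted regression of monthly means against time; statsmodels WLS expects
# inverse-variance weights, one common reading of "monthly variance as weights"
X = sm.add_constant(months)
fit = sm.WLS(means, X, weights=1.0 / variances).fit()
intercept, slope = fit.params
print(f"trend: {slope:.1f} particles/cm3 per month (p = {fit.pvalues[1]:.3g})")
print(f"change over 5 years: {100 * slope * 59 / intercept:.1f}%")

# The nucleation analysis identifies events from jumps in the differenced
# series; a crude flag with an assumed threshold:
diffs = np.diff(means)
event_idx = np.where(diffs > 3 * diffs.std())[0] + 1
```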

Relevance:

20.00%

Publisher:

Abstract:

'The Millennial Adolescent' offers contemporary, stimulating insights for those currently teaching as well as those preparing to teach. Using generational theory, case studies and students' own voices, the book investigates the characteristics of Generation Y, or the Millennial Generation, and points out what all teachers need to know about working with this current generation of students, who are described in a number of ways: digital natives, team-oriented, confident, multi-taskers, high achievers, and a generation unlike any other. The text is structured around the principle that effective teachers need to know who they are teaching as well as what to teach, how to teach it, and how to assess the outcome. It blends well-known frameworks for developing understandings about adolescents with a contemporary socio-cultural construction of adolescence, set in our particular time, era and society. The book reflects the uniqueness of Australian contexts while connecting with international trends and global patterns. Engaging and full of insights, it is essential reading for all professionals dealing with adolescents.

Relevance:

20.00%

Publisher:

Abstract:

For the last two decades heart disease has been the highest single cause of death for the human population. With an alarming number of patients requiring heart transplants, and donations unable to satisfy the demand, treatment looks to mechanical alternatives. Rotary Ventricular Assist Devices (VADs) are miniature pumps which can be implanted alongside the heart to assist its pumping function. These constant flow devices are smaller, more efficient and promise a longer operational life than more traditional pulsatile VADs. The development of rotary VADs has focused on single pumps assisting the left ventricle only, to supply blood for the body. In many patients, however, failure of both ventricles demands that an additional pulsatile device be used to support the failing right ventricle. This condition renders them hospital-bound while they wait for an unlikely heart donation. Reported attempts to use two rotary pumps to support both ventricles concurrently have warned of inherent haemodynamic instability: poor balancing of the pumps’ flow rates quickly leads to vascular congestion, increasing the risk of oedema and of ventricular ‘suckdown’ occluding the inlet to the pump. This thesis introduces a novel Bi-Ventricular Assist Device (BiVAD) configuration in which the pump outputs are passively balanced by vascular pressure. The BiVAD consists of two rotary pumps straddling a mechanical passive controller. Fluctuations in vascular pressure induce small deflections within both pumps, adjusting their outputs and allowing them to maintain arterial pressure. To optimise the passive controller’s interaction with the circulation, its dynamic response is tuned with a spring-mass-damper arrangement. This two-part study presents a comprehensive assessment of the prototype’s ‘viability’ as a support device, considered in terms of its sensitivity to pathogenic haemodynamics and the ability of the passive response to maintain healthy circulation. The first part of the study is an experimental investigation in which a prototype device was designed, built and tested in a pulsatile mock circulation loop. The BiVAD was subjected to a range of haemodynamic imbalances as well as a dynamic analysis to assess the functionality of the mechanical damper. The second part introduces the development of a numerical program to simulate human circulation supported by the passively controlled BiVAD. Both investigations showed that the prototype was able to mimic the native baroreceptor response. Simulating hypertension, poor flow balancing and subsequent ventricular failure during BiVAD support allowed the passive controller’s response to be assessed: triggered by the resulting pressure imbalance, the controller responded by passively adjusting the VAD outputs in order to maintain healthy arterial pressures. This baroreceptor-like response demonstrated the inherent stability of the auto-regulating BiVAD prototype. Simulating pulmonary hypertension in the more observable numerical model, however, revealed a serious issue with the passive response. The subsequent decrease in venous return into the left heart went unnoticed by the passive controller. Meanwhile, the coupled nature of the passive response not only decreased RVAD output to reduce pulmonary arterial pressure, but also increased LVAD output. Consequently, the LVAD increased fluid evacuation from the left ventricle (LV) and so actually accelerated the onset of LV collapse.

It was concluded that despite the inherently stable baroreceptor-like response of the passive controller, its lack of sensitivity to venous return made it unviable in its present configuration. The study revealed a number of other important findings. Perhaps the most significant was that the reduced pulse experienced during constant flow support unbalanced the ratio of effective resistances of the two vascular circuits; even during steady rotary support, therefore, the resulting ventricular volume imbalance increased the likelihood of suckdown. Additionally, mechanical damping of the passive controller’s response successfully filtered out pressure fluctuations from residual ventricular function. Finally, the importance of recognising inertial contributions to blood flow in the atria and ventricles in a numerical simulation was highlighted. This thesis documents the first attempt to create a fully auto-regulated rotary cardiac assist device. Initial results encourage the development of an inlet configuration sensitive to low flow, such as collapsible inlet cannulae. Combining this with the existing baroreceptor-like response of the passive controller would render a highly stable passively controlled BiVAD configuration. The prototype controller’s passive interaction with the vasculature is a significant step towards a highly stable new generation of artificial heart.
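
To illustrate the kind of passively controlled dynamics described above, here is a hedged Python sketch of a spring-mass-damper controller driven by a vascular pressure imbalance. All parameter values are hypothetical, chosen only so that the damper attenuates a ~1.2 Hz pulsatile residue while tracking a sustained imbalance, in the spirit of the damping result reported in the abstract.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters: a near-critically damped controller whose natural
# frequency (~0.5 Hz) sits below the ~1.2 Hz residual ventricular pulse
m, c, k = 0.5, 3.0, 5.0      # mass (kg), damping (N.s/m), stiffness (N/m)
area = 1e-5                  # effective area coupling pressure to force (m2)

def dP(t):
    """Vascular pressure imbalance (Pa): sustained step + pulsatile residue."""
    return 2000.0 * (t > 2.0) + 300.0 * np.sin(2 * np.pi * 1.2 * t)

def rhs(t, y):
    x, v = y                 # controller deflection (m) and velocity (m/s)
    return [v, (area * dP(t) - c * v - k * x) / m]

sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0], max_step=1e-2)
# Deflection x trades output between the two pumps; the damped response tracks
# the sustained imbalance while attenuating the 1.2 Hz pulsatile component.
print(f"steady deflection ~ {sol.y[0, -1] * 1000:.1f} mm")
```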

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Much of the literature on clusters has focused on the economic advantages of clusters and how these can be achieved in terms of competition, regional development and local spillovers. Some studies have focused at the level of the individual firm; however, human resource management (HRM) in individual clustered firms has received scant attention. This paper innovatively utilises the extended Resource Based View (RBV) of the firm as a framework to conceptualise the human resource processes of individual firms within a cluster. RBV is argued to be a useful tool because it explains rents that arise outside a firm’s boundaries. The paper concludes that HRM can assist in generating rents for firms, and for clusters more broadly, when the function supports the valuable inter-firm relationships important for realising inter-firm advantages.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Principal Topic

High technology consumer products such as notebooks, digital cameras and DVD players are not introduced into a vacuum. Consumer experience with related earlier generation technologies, such as PCs, film cameras and VCRs, and the installed base of these products strongly impact the market diffusion of the new generation products. Yet technology substitution has received only sparse attention in the diffusion of innovation literature. Research for consumer durables has been dominated by studies of (first purchase) adoption (cf. Bass 1969), which do not explicitly consider the presence of an existing product/technology. More recently, considerable attention has also been given to replacement purchases (cf. Kamakura and Balasubramanian 1987). Only a handful of papers explicitly deal with the diffusion of technology/product substitutes (e.g. Norton and Bass 1987; Bass and Bass 2004). They propose diffusion-type aggregate-level sales models that are used to forecast the overall sales for successive generations. Lacking household data, these aggregate models are unable to give insights into the decisions by individual households: whether to adopt generation II, and if so, when and why.

This paper makes two contributions. First, it is the first large-scale empirical study that collects household data for successive generations of technologies in an effort to understand the drivers of adoption. Second, in comparison to traditional analysis that evaluates technology substitution as an 'adoption of innovation' type process, we propose that, from a consumer's perspective, technology substitution combines elements of both adoption (adopting the new generation technology) and replacement (replacing the generation I product with generation II). Based on this proposition, we develop and test a number of hypotheses.

Methodology/Key Propositions

In some cases, successive generations are clear 'substitutes' for the earlier generation, in that they have almost identical functionality: for example, successive generations of PCs (Pentium I to II to III), or flat screen TVs substituting for colour TVs. More commonly, however, the new technology (generation II) is a 'partial substitute' for the existing technology (generation I). For example, digital cameras substitute for film-based cameras in the sense that they perform the same core function of taking photographs. They have the additional attributes of easier copying and sharing of images; however, the attribute of image quality is inferior. In cases of partial substitution, some consumers will purchase generation II products as substitutes for their generation I product, while other consumers will purchase generation II products as additional products to be used alongside their generation I product. We propose that substitute generation II purchases combine elements of both adoption and replacement, whereas additional generation II purchases are a solely adoption-driven process.

Extensive research on innovation adoption has consistently shown that consumer innovativeness is the most important consumer characteristic driving adoption timing (Goldsmith et al. 1995; Gielens and Steenkamp 2007). Hence, we expect consumer innovativeness to influence both additional and substitute generation II purchases.

Hypothesis 1a: More innovative households will make additional generation II purchases earlier.
Hypothesis 1b: More innovative households will make substitute generation II purchases earlier.
Hypothesis 1c: Consumer innovativeness will have a stronger impact on additional generation II purchases than on substitute generation II purchases.

As outlined above, substitute generation II purchases act in part like a replacement purchase for the generation I product. Prior research (Bayus 1991; Grewal et al. 2004) identified product age as the most dominant factor influencing replacements. Hence, we hypothesise:

Hypothesis 2: Households with older generation I products will make substitute generation II purchases earlier.

Our survey of 8,077 households investigates their adoption of two new generation products: notebooks as a technology change to PCs, and DVD players as a technology shift from VCRs. We employ Cox hazard modelling to study the factors influencing the timing of a household's adoption of generation II products. We determine whether this is an additional or substitute purchase by asking whether the generation I product is still used, and conduct a separate hazard model for additional and substitute purchases. Consumer innovativeness is measured as domain innovativeness, adapted from the scales of Goldsmith and Hofacker (1991) and Flynn et al. (1996). The age of the generation I product is calculated from the most recent household purchase of that product. Control variables include the age, size and income of the household, and the age and education of the primary decision-maker.

Results and Implications

Our preliminary results confirm our hypotheses. Consumer innovativeness has a strong influence on both additional purchases (exp(β) = 1.11) and substitute purchases (exp(β) = 1.09), where exp(β) is interpreted as the increased likelihood of purchase for an increase of 1.0 on a 7-point innovativeness scale. Also consistent with our hypotheses, the age of the generation I product has a dramatic influence on substitute purchases of VCRs/DVD players (exp(β) = 2.92) and a strong influence for PCs/notebooks (exp(β) = 1.30), where exp(β) is interpreted as the increased likelihood of purchase for an increase of 10 years in the age of the generation I product. Yet, also as hypothesised, product age had no influence on additional purchases.

The results lead to two key implications. First, there is a clear distinction between additional and substitute purchases of generation II products, each with different drivers; treating these as a single process will mask the true drivers of adoption. Second, for substitute purchases, product age is a key driver. Hence, marketers of high technology products can utilise data on generation I product age (e.g. from warranty or loyalty programs) to target customers who are more likely to make a purchase.
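
As an illustration of the modelling approach (not the study's data or code), the following Python sketch fits a Cox proportional hazards model with the lifelines library to simulated household data. Column names and coefficient values are assumptions, and the study fit separate models of this form for additional and substitute purchases.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200

# Simulated households (all values illustrative, not the survey data)
innov = rng.uniform(1, 7, n)                  # domain innovativeness, 7-point scale
gen1_age = rng.uniform(0.5, 12, n)            # age of the generation I product, years
hazard = 0.01 * np.exp(0.10 * innov + 0.11 * gen1_age)
time_to_buy = rng.exponential(1.0 / hazard)   # months until generation II purchase
observed = time_to_buy < 60                   # censor at a 5-year survey window

df = pd.DataFrame({
    "months": np.minimum(time_to_buy, 60),
    "adopted": observed.astype(int),
    "innovativeness": innov,
    "gen1_age_years": gen1_age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="adopted")
print(cph.hazard_ratios_)   # exp(coef) per unit of each covariate
```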