964 results for Probable Number Technique
Abstract:
Size distributions of expiratory droplets expelled during coughing and speaking, and the velocities of the expiration air jets, were measured for healthy volunteers. Droplet size was measured using the Interferometric Mie imaging (IMI) technique, while the Particle Image Velocimetry (PIV) technique was used to measure air velocity. These techniques allowed measurements in close proximity to the mouth and avoided air sampling losses. The average expiration air velocity was 11.7 m/s for coughing and 3.9 m/s for speaking. Under the experimental setting, evaporation and condensation effects had a negligible impact on the measured droplet size. The geometric mean diameter of droplets was 13.5 µm for coughing and 16.0 µm for speaking (counting from 1 to 100). The estimated total number of droplets expelled ranged from 947 to 2085 per cough and 112 to 6720 for speaking. The estimated droplet concentrations ranged from 2.4 to 5.2 cm⁻³ per cough and 0.004 to 0.223 cm⁻³ for speaking.
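As a hedged illustration of the geometric mean diameter statistic reported above (a sketch only; the droplet diameters below are made-up placeholders, not data from the study):

    import numpy as np

    # Hypothetical droplet diameters in micrometres (illustrative placeholders).
    diameters_um = np.array([2.0, 8.5, 12.0, 15.5, 21.0, 35.0, 60.0])

    # Geometric mean diameter: exponentiate the mean of the log-diameters.
    gmd = np.exp(np.mean(np.log(diameters_um)))

    # Geometric standard deviation, commonly reported alongside the GMD.
    gsd = np.exp(np.std(np.log(diameters_um)))

    print(f"GMD = {gmd:.1f} um, GSD = {gsd:.2f}")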
Abstract:
A novel technique was used to measure emission factors for commonly used commercial aircraft, including a range of Boeing and Airbus airframes, under real-world conditions. Engine exhaust emission factors for particles, in terms of particle number and mass (PM2.5), along with those for CO2 and NOx, were measured for over 280 individual aircraft during the various modes of the landing/takeoff (LTO) cycle. Results from this study show that particle number and NOx emission factors depend on aircraft engine thrust level. Minimum and maximum emission factors for particle number, PM2.5 and NOx were in the ranges 4.16×10¹⁵–5.42×10¹⁶ kg⁻¹, 0.03–0.72 g kg⁻¹ and 3.25–37.94 g kg⁻¹ respectively, across all measured airframes and LTO cycle modes. Number size distributions of emitted particles for the naturally diluted aircraft plumes in each mode of the LTO cycle showed that particles were predominantly 4–100 nm in diameter in all cases. In general, the size distributions exhibited similar modality during all phases of the LTO cycle. A very distinct nucleation mode was observed in all particle size distributions, except for taxiing and landing of A320 aircraft, and accumulation modes were observed in all particle size distributions. Analysis of aircraft engine emissions during the LTO cycle showed that aircraft thrust level is considerably higher during taxiing than idling, suggesting that International Civil Aviation Organization (ICAO) standards, which treat taxi and idle thrust levels as the same (7% of total thrust), need to be modified [1].
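The abstract does not spell out the calculation, but fuel-based emission factors of this kind are commonly obtained by ratioing a pollutant's plume enhancement to the CO2 enhancement and scaling by the CO2 emitted per kilogram of fuel burned. A minimal sketch under that assumption (the emission index and plume numbers are illustrative, not study data):

    # Fuel-based emission factor sketch. All numbers are illustrative
    # assumptions, not measurements from the study.
    EI_CO2_G_PER_KG_FUEL = 3160.0  # approx. grams of CO2 per kg of jet fuel

    def emission_factor(delta_pollutant, delta_co2_g):
        """Emission factor per kg of fuel, from plume enhancements above
        background: delta_pollutant in its own units (particles or grams),
        delta_co2_g in grams of CO2."""
        return delta_pollutant / delta_co2_g * EI_CO2_G_PER_KG_FUEL

    # Example: 2.0e12 particles and 150 g of CO2 above background in a plume.
    print(f"{emission_factor(2.0e12, 150.0):.2e} particles per kg fuel")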
Abstract:
The measurement of Cobb angles from radiographs is routine practice in spinal clinics. The technique relies on the use and availability of specialist equipment such as a goniometer, cobbometer or protractor. The aim of this study was to validate the use of the iPhone (Apple Inc) combined with Tilt Meter Pro software, as compared to a protractor, in the measurement of Cobb angles. Between November 2008 and December 2008, 20 patients were selected at random from the Paediatric Spine Research Group's database. A power calculation indicated that, with n = 240 measurements, the study had a 96% chance of detecting a 5-degree difference between groups. All patients had idiopathic scoliosis, with a range of curve types and severities. The study found that the iPhone combined with Tilt Meter Pro software offers a faster alternative to the traditional method of Cobb angle measurement. The use of the iPhone offers a more convenient way of measuring Cobb angles in the outpatient setting, and its intra-observer repeatability is equivalent to that of the protractor.
Abstract:
The effects of pedestrian movement on multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) channel capacity were investigated using experiment and simulation. The experiment was conducted at 5.2 GHz with a MIMO-OFDM packet transmission demonstrator, built in-house, using four transmitters and four receivers. A geometric-optics-based ray tracing technique was used to simulate the experimental scenarios. Changes in the channel capacity dynamic range were analysed for different numbers of pedestrians (0–3) and antennas (2–4). Measurement and simulation results show that the dynamic range increases with both the number of pedestrians and the number of antennas on the transmitter and receiver arrays.
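The capacity figures behind such a dynamic-range analysis are conventionally computed from the standard MIMO capacity formula with equal power allocation; a minimal sketch (the 4x4 channel matrix here is randomly generated, not measured data):

    import numpy as np

    def mimo_capacity(H, snr_linear):
        """Capacity in bit/s/Hz with equal power over Nt transmit antennas:
        C = log2 det(I + (SNR / Nt) * H * H^H)."""
        nr, nt = H.shape
        det = np.linalg.det(np.eye(nr) + (snr_linear / nt) * (H @ H.conj().T))
        return float(np.log2(det.real))

    rng = np.random.default_rng(0)
    # Illustrative 4x4 Rayleigh-fading channel (4 Tx, 4 Rx antennas).
    H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
    print(f"C at 20 dB SNR = {mimo_capacity(H, 10 ** (20 / 10)):.2f} bit/s/Hz")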
Abstract:
Denial-of-service (DoS) and distributed denial-of-service (DDoS) attacks attempt to temporarily disrupt users or computer resources, causing service unavailability for legitimate users of the internetworking system. The most common type of DoS attack occurs when adversaries flood the server with a large amount of bogus data to interfere with or disrupt its service. The attack can be either a single-source attack, which originates at only one host, or a multi-source attack, in which multiple hosts coordinate to flood a large number of packets to the server. Cryptographic mechanisms in authentication schemes are one approach to helping the server validate malicious traffic. Since authentication in key establishment protocols requires the verifier to spend some resources before it can detect bogus messages, adversaries may exploit this flaw to mount an attack that overwhelms the server's resources. Attackers can perform this kind of attack because many key establishment protocols incorporate strong authentication at the beginning phase, before they can identify an attack. This is an example of the DoS threat in most key establishment protocols: they have been implemented to support confidentiality and data integrity, but do not carefully consider other security objectives, such as availability. The main objective of this research is to design denial-of-service-resistant mechanisms for key establishment protocols. In particular, we focus on the design of cryptographic protocols, related to key establishment, that implement client puzzles to protect the server against resource exhaustion attacks. Another objective is to extend formal analysis techniques to include DoS resistance. The formal analysis approach is used not only to analyse and verify the security of a cryptographic scheme carefully, but also to help, at the design stage, to produce new protocols with a high level of security guarantee. In this research, we focus on Meadows' cost-based framework as an analysis technique, and we implement a DoS-resistance model using Coloured Petri Nets. Meadows' cost-based framework was proposed specifically to assess denial-of-service vulnerabilities in cryptographic protocols using mathematical proof, while Coloured Petri Nets are used to model and verify communication protocols through interactive simulations. In addition, Coloured Petri Nets help the protocol designer to clarify and reduce inconsistencies in the protocol specification. Therefore, a further objective of this research is to explore vulnerabilities in existing DoS-resistant protocols, as well as to extend the formal analysis approach into a new framework for improving DoS resistance and evaluating the performance of the proposed mechanism. In summary, the specific outcomes of this research include the following:
1. A taxonomy of denial-of-service-resistant strategies and techniques used in key establishment protocols;
2. A critical analysis of existing DoS-resistant key exchange and key establishment protocols;
3. An implementation of Meadows' cost-based framework using Coloured Petri Nets for modelling and evaluating DoS-resistant protocols; and
4. The development of new, efficient and practical DoS-resistant mechanisms to improve resistance to denial-of-service attacks in key establishment protocols.
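The thesis's own puzzle constructions are not given in the abstract, but a common hash-based client puzzle of the general kind it refers to asks the client to find a value whose hash with a server nonce has k leading zero bits, so that solving costs about 2^k hashes while verification costs one; a minimal illustrative sketch (not the thesis's mechanism):

    import hashlib
    import os

    def leading_bits_zero(digest: bytes, k: int) -> bool:
        """True if the first k bits of the digest are zero."""
        return int.from_bytes(digest, "big") >> (len(digest) * 8 - k) == 0

    def solve_puzzle(nonce: bytes, k: int) -> int:
        """Client work: brute force, expected cost about 2**k hashes."""
        solution = 0
        while not leading_bits_zero(
                hashlib.sha256(nonce + solution.to_bytes(8, "big")).digest(), k):
            solution += 1
        return solution

    def verify(nonce: bytes, solution: int, k: int) -> bool:
        """Server work: one cheap hash, so verification itself cannot be
        exploited for resource exhaustion."""
        return leading_bits_zero(
            hashlib.sha256(nonce + solution.to_bytes(8, "big")).digest(), k)

    nonce = os.urandom(16)
    s = solve_puzzle(nonce, k=16)
    assert verify(nonce, s, k=16)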
Abstract:
Measurements of submicrometre (< 1.0 µm) and ultrafine (diameter < 0.1 µm) particle number concentrations have attracted attention over the last decade because the potential health impacts of exposure to these particles can be more significant than those of exposure to larger particles. At present, ultrafine particles are not regularly monitored and are yet to be incorporated into air quality monitoring programs. As a result, very few studies have analysed long-term and spatial variations in ultrafine particle concentration, and none have done so in Australia. To address this gap in scientific knowledge, the aim of this research was to investigate long-term trends and seasonal variations in particle number concentrations in Brisbane, Australia. Data collected over a five-year period were analysed using weighted regression models. Monthly mean concentrations in the morning (6:00-10:00) and the afternoon (16:00-19:00) were plotted against time in months, using the monthly variances as the weights. Over the five-year period, submicrometre and ultrafine particle concentrations in the morning increased by 105.7% and 81.5% respectively, whereas in the afternoon there was no significant trend. The morning concentrations were associated with fresh traffic emissions and the afternoon concentrations with the background. The statistical tests applied to the seasonal models, on the other hand, indicated that there was no seasonal component. The spatial variation in size distribution across a large urban area was investigated using particle number size distribution data collected at nine different locations during different campaigns. The size distributions were represented by modal structures and cumulative size distributions. Particle number peaked at around 30 nm, except at an isolated site dominated by diesel trucks, where it peaked at around 60 nm. Ultrafine particles contributed 82%-90% of the total particle number. At the sites dominated by petrol vehicles, nanoparticles (< 50 nm) contributed 60%-70% of the total particle number, while at the site dominated by diesel trucks they contributed 50%. Although the sampling campaigns took place during different seasons and were of varying duration, these variations did not affect the particle size distributions; the results suggested that the distributions were instead affected by differences in traffic composition and distance to the road. To investigate the occurrence of nucleation events, that is, secondary particle formation from gaseous precursors, particle size distribution data collected over a 13-month period during five different campaigns were analysed. The study area was a complex urban environment influenced by anthropogenic and natural sources. The study introduced a new application of time series differencing for the identification of nucleation events. To evaluate the conditions favourable to nucleation, the meteorological conditions and gaseous concentrations prior to and during nucleation events were recorded. Gaseous concentrations did not exhibit a clear pattern of change. Nucleation was found to be associated with sea breezes and long-range transport. The implication of this finding is that, whilst vehicles are the most important source of ultrafine particles, sea breezes and aged gaseous emissions play a more important role in secondary particle formation in the study area.
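A minimal sketch of the weighted regression described (monthly means regressed on time with monthly variances as weights; inverse-variance weighting is assumed here, and the data are synthetic placeholders, not the Brisbane measurements):

    import numpy as np

    # Synthetic placeholders: 60 monthly mean concentrations and variances.
    rng = np.random.default_rng(1)
    months = np.arange(60.0)
    monthly_mean = 1.0e4 + 80.0 * months + rng.normal(0.0, 500.0, 60)
    monthly_var = rng.uniform(1e5, 5e5, 60)

    # Weighted least squares: weights proportional to inverse monthly variance.
    w = 1.0 / monthly_var
    X = np.column_stack([np.ones_like(months), months])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * monthly_mean))
    print(f"intercept = {beta[0]:.1f}, trend = {beta[1]:.1f} per month")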
Abstract:
Quantitative behaviour analysis requires the classification of behaviour to produce the basic data. In practice, much of this work will be performed by multiple observers, and maximising inter-observer consistency is of particular importance. Another discipline where consistency in classification is vital is biological taxonomy. A classification tool of great utility, the binary key, is designed to simplify the classification decision process and ensure consistent identification of proper categories. We show how this same decision-making tool - the binary key - can be used to promote consistency in the classification of behaviour. The construction of a binary key also ensures that the categories in which behaviour is classified are complete and non-overlapping. We discuss the general principles of design of binary keys, and illustrate their construction and use with a practical example from education research.
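A binary key is simply a chain of yes/no questions, each leading either to another question or to a terminal category, which is what guarantees complete and non-overlapping categories; a minimal sketch with made-up behaviour categories (illustrative, not the paper's example):

    # A binary key as nested yes/no nodes: a node is either a category (leaf)
    # or a (question, yes_branch, no_branch) triple. Categories are made up.
    KEY = ("Is the student interacting with another person?",
           ("Is the interaction about the task?",
            "on-task interaction",
            "off-task interaction"),
           ("Is the student looking at the task materials?",
            "on-task solo work",
            "off-task solo behaviour"))

    def classify(node, answer):
        """Walk the key, answering each question with the supplied function."""
        while isinstance(node, tuple):
            question, yes_branch, no_branch = node
            node = yes_branch if answer(question) else no_branch
        return node

    # Example: an observer who answers "yes" to every question.
    print(classify(KEY, lambda q: True))  # -> on-task interaction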
Abstract:
For certain continuum problems, it is desirable and beneficial to combine two different methods in order to exploit their advantages while evading their disadvantages. In this paper, a bridging transition algorithm is developed for combining the meshfree method (MM) with the finite element method (FEM). In this coupled method, the meshfree method is used in the sub-domains where high accuracy is required, and the finite element method is employed in the other sub-domains to improve computational efficiency. The MM domain and the FEM domain are connected by a transition (bridging) region. A modified variational formulation and the Lagrange multiplier method are used to ensure the compatibility of displacements and their gradients. To improve computational efficiency and reduce the meshing cost in the transition region, regularly distributed transition particles, which are independent of both the meshfree nodes and the FE nodes, can be inserted into the transition region. The newly developed coupled method is applied to the stress analysis of 2D solids and structures in order to investigate its performance and study its parameters. Numerical results show that the present coupled method is convergent, accurate and stable. It has promising potential for practical applications, because it takes advantage of both the meshfree method and FEM while overcoming their shortcomings.
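The paper's formulation is not reproduced in the abstract, but Lagrange-multiplier couplings of this general kind typically augment the total potential energy with an interface constraint term; a hedged sketch in standard notation (assumed, not taken from the paper):

    \Pi = \Pi_{\mathrm{MM}} + \Pi_{\mathrm{FEM}}
        + \int_{\Gamma_t} \boldsymbol{\lambda} \cdot
          \left( \mathbf{u}^{\mathrm{MM}} - \mathbf{u}^{\mathrm{FEM}} \right) \, \mathrm{d}\Gamma

Taking the variation with respect to the multiplier field recovers the compatibility condition u^MM = u^FEM on the transition interface Γ_t.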
Abstract:
We present a method for topological SLAM that specifically targets loop closing for edge-ordered graphs. Instead of using a heuristic approach to accept or reject loop closing, we propose a probabilistically grounded multi-hypothesis technique that relies on the incremental construction of a map/state hypothesis tree. Loop closing is introduced automatically within the tree expansion, and likely hypotheses are chosen based on their posterior probability after a sequence of sensor measurements. Careful pruning of the hypothesis tree keeps the growing number of hypotheses under control and a recursive formulation reduces storage and computational costs. Experiments are used to validate the approach.
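A minimal sketch of the multi-hypothesis bookkeeping described (branch each hypothesis over candidate loop closings on every measurement, reweight by likelihood, prune to the most probable leaves); the likelihood and proposal functions are stand-ins, not the paper's models:

    import heapq

    class Hypothesis:
        def __init__(self, topology, log_posterior, parent=None):
            self.topology = topology        # candidate map/state (opaque here)
            self.log_posterior = log_posterior
            self.parent = parent            # recursive link back up the tree

    def expand(hypotheses, measurement, likelihood, proposals, max_keep=50):
        """One tree-expansion step: branch every hypothesis over proposed
        topologies (including loop closures), reweight by measurement
        likelihood, and prune to the max_keep most probable leaves."""
        children = []
        for h in hypotheses:
            for topo in proposals(h.topology, measurement):
                lp = h.log_posterior + likelihood(topo, measurement)
                children.append(Hypothesis(topo, lp, parent=h))
        return heapq.nlargest(max_keep, children, key=lambda c: c.log_posterior)

    # Toy usage: topologies are strings; each step proposes "close loop" or not.
    leaves = expand([Hypothesis("start", 0.0)], measurement=None,
                    likelihood=lambda t, m: -1.0 if t.endswith("L") else -2.0,
                    proposals=lambda t, m: [t + "L", t + "N"])
    print([(h.topology, h.log_posterior) for h in leaves])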
Abstract:
The report presents a methodology for whole-of-life-cycle cost analysis of alternative treatment options for bridge structures requiring rehabilitation. The methodology was developed after a review of current methods, which established that a life cycle analysis based on a probabilistic risk approach has many advantages, including the essential ability to consider the variability of input parameters. The input parameters for the analysis are initial cost; maintenance, monitoring and repair cost; user cost; and failure cost. The methodology uses Monte Carlo simulation to combine a number of probability distributions and so establish the distribution of whole-of-life-cycle cost. In performing the simulation, the need for a powerful software package that works with a spreadsheet program was identified; after exploring several products on the market, the @RISK software was selected. In conclusion, the report presents a typical decision-making scenario considering two alternative treatment options.
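A minimal sketch of the simulation approach described (sample each cost component from an assumed distribution, accumulate the whole-of-life-cycle cost distribution); plain Python stands in for the @RISK spreadsheet add-in, and all distributions and parameters are illustrative placeholders, not values from the report:

    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # Monte Carlo trials

    # Illustrative cost components (in $k); the distributions are assumptions.
    initial = rng.normal(500.0, 50.0, N)             # initial treatment cost
    upkeep  = rng.lognormal(np.log(120.0), 0.3, N)   # maintenance/monitoring/repair
    user    = rng.triangular(50.0, 80.0, 150.0, N)   # user cost
    failure = rng.binomial(1, 0.02, N) * rng.normal(2000.0, 300.0, N)  # failure cost

    lcc = initial + upkeep + user + failure
    print(f"mean = {lcc.mean():.0f}, 95th percentile = {np.percentile(lcc, 95):.0f}")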
Abstract:
Principal Topic: The study of the origin and characteristics of venture ideas - or ''opportunities'' as they are often called - and their contextual fit are key research goals in entrepreneurship (Davidsson, 2004). For the purpose of this study, we define a venture idea as ''the core ideas of an entrepreneur about what to sell, how to sell, to whom to sell, and how the entrepreneur acquires or produces the product or service to be sold''. When realized, the venture idea becomes a ''business model''. Even though venture ideas are central to entrepreneurship, their characteristics and their effects on the entrepreneurial process remain poorly understood. According to Schumpeter (1934), entrepreneurs can creatively destroy existing market conditions by introducing new products/services, new production methods, new markets, new sources of supply, and the reorganization of industries. The introduction, development and use of new ideas is generally called ''innovation'' (Damanpour & Wischnevsky, 2006); ''newness'' is a property of innovation and a relative term, referring to the degree of unfamiliarity of a venture idea, either to a firm or to a market. Schumpeter's (1934) discussion of five different types of newness indicates that the type of newness is an important issue. More recently, Shane and Venkataraman (2000) called for research taking into consideration not only the variation in characteristics of individuals but also the heterogeneity of venture ideas. Empirically, Samuelson (2001, 2004) investigated process differences between innovative and imitative venture ideas, although he used only a crude dichotomy regarding venture idea newness. According to Davidsson (2004), entrepreneurs can introduce new economic activities ranging from pure imitation to being new to the entire world market, highlighting that newness is a matter of degree. Dahlqvist (2007) examined venture idea newness and made an attempt at a more refined assessment of the degree and type of newness of venture ideas. Building on these predecessors, our study refines the assessment of venture idea newness by measuring the degree of newness (new to the world, new to the market, substantially improved while not entirely new, and imitation) for four different types of newness (product/service, method of production, method of promotion, and customer/target market). We then relate the type and degree of newness to the pace of progress in the nascent venturing process, hypothesizing that newness will slow down the business creation process. Shane and Venkataraman (2000) framed entrepreneurship as the nexus of opportunities and individuals, and in line with this, some scholars have investigated the relationship between individuals and opportunities. For example, Shane (2000) investigated the relatedness between individuals' prior knowledge and the identification of opportunities. Shepherd & DeTinne (2005) identified a positive relationship between potential financial reward and the identification of innovative venture ideas. Sarasvathy's ''effectuation theory'' assumes a high degree of relatedness between founders' skills, knowledge and resources and the selection of venture ideas. However, the entrepreneurship literature contains few analyses of how this relatedness affects the progress of the venturing process. Therefore, we also assess venture ideas' degree of relatedness to prior knowledge and resources, and relate these, too, to the pace of progress in the nascent venturing process. We hypothesize that relatedness will increase the speed of business creation.
Methodology: This study compares early findings from data collected through the Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE). CAUSEE is a longitudinal study whose primary objective is to uncover the factors that initiate, hinder and facilitate the process of emergence and development of new firms. Data were collected from a representative sample of some 30,000 households in Australia using random digit dialing (RDD) telephone survey interviews. The first round of data collection identified 600 entrepreneurs who were currently involved in the business start-up process. The unit of analysis is the emerging venture, with the respondent acting as its spokesperson. The study methodology allows researchers to identify ventures in the early stages of creation and to follow their progression longitudinally over successive data collection periods. Our measures of newness build on previous work by Dahlqvist (2007); our adapted version was developed over two pre-tests with about 80 participants each. The measures of relatedness were developed through the same two rounds of pre-testing. The pace of progress in the venture creation process is assessed with the help of time-stamped gestation activities, a technique developed in the Panel Study of Entrepreneurial Dynamics (PSED).
Results and Implications: We hypothesized that venture idea newness slows down the venturing process whereas relatedness facilitates it. Results from the 600 nascent entrepreneurs in Australia indicate marginal support for the hypothesis that relatedness assists gestation progress. Newness is significant, but with the opposite sign to that hypothesized. The results have a number of implications for researchers, business founders, consultants and policy makers, in terms of better knowledge of the venture creation process.
Abstract:
The measurement of Cobb angles on radiographs of patients with spinal deformities is routine practice in spinal clinics. The technique relies on the use and availability of specialist equipment such as a goniometer, cobbometer or protractor. The aim of this study was to validate the use of the iPhone (Apple Inc) combined with Tilt Meter Pro software, as compared to a protractor, in the measurement of Cobb angles. The iPhone combined with Tilt Meter Pro software offers a faster alternative to the traditional method of Cobb angle measurement and a more convenient way of measuring Cobb angles in the outpatient setting; its intra-observer repeatability is equivalent to that of the protractor.
Abstract:
Having flexible notions of the unit (e.g., 26 ones can be thought of as 2.6 tens, 1 ten and 16 ones, 260 tenths, etc.) should be a major focus of elementary mathematics education. However, these powerful notions are often relegated to computations where the major emphasis is on "getting the right answer", so that procedural rather than conceptual knowledge becomes the primary focus. This paper reports on 22 high-performing students' reunitising processes, ascertained from individual interviews on tasks requiring unitising, reunitising and regrouping; errors were categorised to depict particular thinking strategies. The results show that, even for high-performing students, regrouping is a cognitively complex task. This paper analyses this complexity and draws inferences for teaching.
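As a concrete illustration of the flexibility in question (an illustration only, not part of the study), renaming a quantity in another place-value unit amounts to multiplying by the ratio of the unit values:

    # Re-express a quantity in a different place-value unit,
    # e.g. 26 ones -> 2.6 tens -> 260 tenths.
    UNITS = {"hundreds": 100, "tens": 10, "ones": 1, "tenths": 0.1}

    def reunitise(count, from_unit, to_unit):
        """Number of to_unit units equivalent to count units of from_unit."""
        return count * UNITS[from_unit] / UNITS[to_unit]

    print(reunitise(26, "ones", "tens"))    # 2.6
    print(reunitise(26, "ones", "tenths"))  # 260.0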
Abstract:
Motor vehicles are major emitters of gaseous and particulate pollution in urban areas, and exposure to particulate pollution can have serious health effects, ranging from respiratory and cardiovascular disease to mortality. Motor vehicle tailpipe particle emissions span a broad size range, from 0.003 to 10 µm, and are measured as different subsets of particle mass concentration or particle number count. However, no comprehensive inventories covering this wide size range currently exist in the international published literature. This paper presents the first published comprehensive inventory of motor vehicle tailpipe particle emissions covering the full size range of particles emitted. The inventory was developed for urban South-East Queensland by combining two techniques from distinctly different disciplines: aerosol science and transport modelling. A comprehensive set of particle emission factors was combined with traffic modelling, and tailpipe particle emissions were quantified for particle number (ultrafine particles), PM1, PM2.5 and PM10 for light duty vehicles, heavy duty vehicles and buses. A second aim of the paper was to use the data derived in this inventory for scenario analyses, modelling the particle emission implications of different proportions of passengers travelling in light duty vehicles and buses in the study region, and deriving an estimate of fleet particle emissions in 2026. It was found that heavy duty vehicles (HDVs) in the study region were major emitters of particulate matter pollution: although they contributed only around 6% of total regional vehicle kilometres travelled, they contributed more than 50% of the region's particle number (ultrafine particle) and PM1 emissions. With the freight task in the region predicted to double over the next 20 years, this suggests that HDVs need to be a major focus of mitigation efforts. HDVs dominated particle number (ultrafine particle) and PM1 emissions, while light duty vehicles dominated PM2.5 and PM10 emissions; buses contributed approximately 1-2% of regional particle emissions.
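A minimal sketch of the inventory arithmetic implied here (per-class emission factor multiplied by vehicle kilometres travelled, summed over vehicle classes); the factors and travel figures below are placeholders, not values from the inventory:

    # Particle number inventory sketch: emissions = emission factor x VKT.
    # All numbers are illustrative placeholders, not study data.
    ef_particles_per_km = {"LDV": 2.0e14, "HDV": 9.0e15, "bus": 5.0e15}
    vkt_km_per_year     = {"LDV": 1.8e10, "HDV": 1.1e9,  "bus": 2.0e8}

    total = sum(ef_particles_per_km[v] * vkt_km_per_year[v]
                for v in ef_particles_per_km)
    for v in ef_particles_per_km:
        share = ef_particles_per_km[v] * vkt_km_per_year[v] / total
        print(f"{v}: {share:.1%} of regional particle number emissions")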
Increase in particle number emissions from motor vehicles due to interruption of steady traffic flow
Abstract:
We assess the increase in particle number emissions from motor vehicles driving at steady speed when they are forced to stop and accelerate from rest. Considering the example of a signalized pedestrian crossing on a two-way, single-lane urban road, we use a complex line source method to calculate the total emissions produced by a specific number and mix of light petrol cars and diesel passenger buses, and show that total emissions during a red light are significantly higher than during the time when the light remains green. Replacing two cars with one bus increased the emissions by over an order of magnitude. Given these large differences, we conclude that the importance attached to particle number emissions in traffic management policies should be reassessed in the future.
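A minimal sketch of the red/green comparison, treating total emissions as mode-specific emission rates multiplied by time in each mode; all rates and durations below are illustrative assumptions, not the paper's line source calculation:

    # Particle number emitted by n vehicles over one signal phase.
    # Rates (particles/s per vehicle) and durations are assumptions.
    RATES = {"cruise": 1.0e11, "idle": 5.0e10, "accelerate": 2.0e12}

    def green_emissions(n, pass_time_s=5.0):
        """Vehicles cruise through without stopping."""
        return n * RATES["cruise"] * pass_time_s

    def red_emissions(n, idle_s=30.0, accel_s=10.0):
        """Vehicles idle at the light, then accelerate from rest."""
        return n * (RATES["idle"] * idle_s + RATES["accelerate"] * accel_s)

    print(f"red/green ratio = {red_emissions(10) / green_emissions(10):.1f}")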