903 results for "type of pregnancy"


Relevance:

90.00%

Publisher:

Abstract:

US state-based data breach notification laws have unveiled serious corporate and government failures regarding the security of personal information. These laws require organisations to notify persons who may be affected by an unauthorised acquisition of their personal information. Safe harbours to notification exist if personal information is encrypted. Three types of safe harbour have been identified in the literature: exemptions, rebuttable presumptions and factors. The underlying assumption of exemptions is that encrypted personal information is secure and that unauthorised access therefore does not pose a risk. However, the viability of this assumption is questionable when examined against data breaches involving encrypted information and the demanding practical requirements of effective encryption management. Recent recommendations by the Australian Law Reform Commission (ALRC) would amend the Privacy Act 1988 (Cth) to implement a data breach scheme that includes a different type of safe harbour, factor-based analysis. The authors examine the potential capability of the ALRC's proposed encryption safe harbour in relation to the US experience at the state legislature level.

Relevance:

90.00%

Publisher:

Abstract:

This paper investigates a mobile, wireless sensor/actuator network application for use in the cattle breeding industry. Our goal is to prevent fighting between bulls in on-farm breeding paddocks by autonomously applying appropriate stimuli when one bull approaches another bull. This is an important application because fighting between high-value animals such as bulls during breeding seasons causes significant financial loss to producers. Furthermore, there are significant challenges in this type of application because it requires dynamic animal state estimation, real-time actuation and efficient mobile wireless transmissions. We designed and implemented an animal state estimation algorithm based on a state-machine mechanism for each animal. Autonomous actuation is performed based on the estimated states of an animal relative to other animals. A simple, yet effective, wireless communication model has been proposed and implemented to achieve high delivery rates in mobile environments. We evaluated the performance of our design by both simulations and field experiments, which demonstrated the effectiveness of our autonomous animal control system.
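The per-animal state machine described above might look like the following minimal sketch. The state names, distance thresholds and stimulus decision are illustrative assumptions, not the authors' implementation, which additionally handles real-time sensing and wireless delivery.

```python
# Hypothetical per-animal state machine for proximity-based actuation.
# States, thresholds (metres) and the actuation rule are assumed for
# illustration only.

NORMAL, APPROACHING, TOO_CLOSE = "normal", "approaching", "too_close"

class AnimalStateMachine:
    def __init__(self, warn_dist=30.0, critical_dist=10.0):
        self.state = NORMAL
        self.warn_dist = warn_dist          # assumed warning threshold
        self.critical_dist = critical_dist  # assumed actuation threshold

    def update(self, dist_to_nearest_bull):
        """Re-estimate this animal's state from its distance to the nearest
        other bull; return True when a stimulus should be applied."""
        if dist_to_nearest_bull <= self.critical_dist:
            self.state = TOO_CLOSE
        elif dist_to_nearest_bull <= self.warn_dist:
            self.state = APPROACHING
        else:
            self.state = NORMAL
        return self.state == TOO_CLOSE

sm = AnimalStateMachine()
print(sm.update(50.0))  # False: bulls far apart, no stimulus
print(sm.update(8.0))   # True: within the critical distance
```

Autonomous actuation then amounts to running one such machine per animal and applying the stimulus whenever an update returns True.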

Relevance:

90.00%

Publisher:

Abstract:

As a special type of novel flexible structure, tensegrity holds promise for many potential applications in fields such as materials science, biomechanics, and civil and aerospace engineering. Rhombic systems are an important class of tensegrity structures, in which each bar constitutes the longest diagonal of a rhombus of four strings. In this paper, we address design methods for rhombic structures based on the idea that many tensegrity structures can be constructed by assembling one-bar elementary cells. By analyzing the properties of rhombic cells, we first develop two novel schemes, namely the direct enumeration scheme and the cell-substitution scheme. In addition, a facile and efficient method is presented for integrating several rhombic systems into a larger tensegrity structure. To illustrate the applications of these methods, some novel rhombic tensegrity structures are constructed.

Relevance:

90.00%

Publisher:

Abstract:

Background: Up to 1% of adults will suffer from leg ulceration at some time. The majority of leg ulcers are venous in origin, caused by high pressure in the veins due to blockage or weakness of the valves in the veins of the leg. Prevention and treatment of venous ulcers aim to reduce this pressure, either by removing or repairing the veins, or by applying compression bandages or stockings. The vast majority of venous ulcers are healed using compression bandages. Once healed, ulcers often recur, so it is customary to continue applying compression in the form of bandages, tights, stockings or socks to prevent recurrence.

Objectives: To assess the effects of compression hosiery (socks, stockings, tights) or bandages in preventing the recurrence of venous ulcers, and to determine whether there is an optimum pressure or type of compression for preventing recurrence.

Search methods: The searches for the review were first undertaken in 2000. For this update we searched the Cochrane Wounds Group Specialised Register (October 2007), the Cochrane Central Register of Controlled Trials (CENTRAL), The Cochrane Library 2007 Issue 3, Ovid MEDLINE (1950 to September Week 4 2007), Ovid EMBASE (1980 to 2007 Week 40) and Ovid CINAHL (1982 to October Week 1 2007).

Selection criteria: Randomised controlled trials evaluating compression bandages or hosiery for preventing venous leg ulcers.

Data collection and analysis: Data extraction and assessment of study quality were undertaken by two authors independently.

Main results: No trials compared recurrence rates with and without compression. One trial (300 patients) compared high (UK Class 3) compression hosiery with moderate (UK Class 2) compression hosiery. An intention-to-treat analysis found no significant reduction in recurrence at five years' follow-up with high compression hosiery compared with moderate compression hosiery (relative risk of recurrence 0.82, 95% confidence interval 0.61 to 1.12). This analysis would tend to underestimate the effectiveness of high compression hosiery because a significant proportion of people changed from high to medium compression hosiery; compliance rates were significantly higher with medium than with high compression hosiery. One trial (166 patients) found no statistically significant difference in recurrence between two types of medium (UK Class 2) compression hosiery (relative risk of recurrence with Medi 0.74, 95% confidence interval 0.45 to 1.2). Both trials reported that not wearing compression hosiery was strongly associated with ulcer recurrence, which is circumstantial evidence that compression reduces recurrence. No trials were found that evaluated compression bandages for preventing ulcer recurrence.

Authors' conclusions: No trials compared compression with versus without compression for prevention of ulcer recurrence. Not wearing compression was associated with recurrence in both studies identified in this review; this is circumstantial evidence of the benefit of compression in reducing recurrence. Recurrence rates may be lower with high compression hosiery than with medium compression hosiery, so patients should be offered the strongest compression with which they can comply. Further trials are needed to determine the effectiveness of hosiery prescribed in other settings, i.e. in the UK community and in countries other than the UK.
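The relative risks quoted above are reported with 95% confidence intervals; the standard log-method computation from a 2x2 table can be sketched as follows. The counts below are invented for illustration and are not the trial data.

```python
import math

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Relative risk and 95% CI (log method) from a/n1 events in one group
    and b/n2 events in the comparison group."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Invented illustrative counts (not from the trials above):
rr, lo, hi = relative_risk_ci(45, 150, 55, 150)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

A confidence interval that crosses 1.0, as in the trial reported above, is what "no significant reduction" means in this context.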

Relevance:

90.00%

Publisher:

Abstract:

Traditional birth attendant (TBA) training has been an important component of public health policy interventions to improve maternal and child health in developing countries since the 1970s. Since the 1990s, however, the TBA training strategy has increasingly been seen as irrelevant, ineffective or, on the whole, a failure, owing to evidence that the maternal mortality rate (MMR) in developing countries had not fallen. Although worldwide data show that, by choice or out of necessity, 47 percent of births in the developing world are assisted by TBAs and/or family members, funding for TBA training has been reduced and redirected towards providing skilled birth attendants for all births. Any shift in policy needs to be supported by appropriate evidence on TBA roles in providing maternal and infant health care services and on the effectiveness of the training programmes. This article reviews the literature on the characteristics and role of TBAs in South Asia, with an emphasis on India. The aim was to assess the contribution of TBAs in providing maternal and infant health care at different stages of pregnancy and after delivery, and the birthing practices adopted in home births. The review revealed that, apart from TBAs, various other people in the community are also involved in making decisions about the welfare and health of the birthing mother and newborn baby. Nonetheless, TBAs have changing, localised but significant roles in delivery, postnatal and infant care in India. Certain traditional birthing practices, such as bathing babies immediately after birth, not weighing babies after birth and not feeding with colostrum, are adopted in home births as well as in health institutions in India. There is therefore a thin, precarious balance between the application of biomedical and traditional knowledge. Customary rituals and perceptions essentially shape practices in home and institutional births, and training of TBAs therefore needs to be implemented in conjunction with community awareness programmes.

Relevance:

90.00%

Publisher:

Abstract:

The performance of an adaptive filter may be studied through the behaviour of the optimal and adaptive coefficients in a given environment. This thesis investigates the performance of finite impulse response adaptive lattice filters for two classes of input signals: (a) frequency modulated signals with polynomial phases of order p in complex Gaussian white noise (as nonstationary signals), and (b) impulsive autoregressive processes with alpha-stable distributions (as non-Gaussian signals). Initially, an overview is given of linear prediction and adaptive filtering. The convergence and tracking properties of stochastic gradient algorithms are discussed for stationary and nonstationary input signals. It is explained that the stochastic gradient lattice algorithm has many advantages over the least-mean-square algorithm, including a modular structure, easily guaranteed stability, lower sensitivity to the eigenvalue spread of the input autocorrelation matrix, and easy quantization of the filter coefficients (normally called reflection coefficients). We then characterize the performance of the stochastic gradient lattice algorithm for frequency modulated signals through the optimal and adaptive lattice reflection coefficients. This is a difficult task due to the nonlinear dependence of the adaptive reflection coefficients on the preceding stages and the input signal. To ease the derivations, we assume that the reflection coefficients of each stage are independent of the inputs to that stage. The optimal lattice filter is then derived for frequency modulated signals by computing the optimal values of the residual errors, reflection coefficients, and recovery errors. Next, we show the tracking behaviour of the adaptive reflection coefficients for frequency modulated signals by computing the average tracking model of these coefficients for the stochastic gradient lattice algorithm.
The second-order convergence of the adaptive coefficients is investigated by modeling the theoretical asymptotic variance of the gradient noise at each stage. The accuracy of the analytical results is verified by computer simulations. Using these analytical results, we show a new property, the polynomial-order reducing property of adaptive lattice filters, which may be used to reduce the order of the polynomial phase of input frequency modulated signals. Two examples show how this property may be used in processing frequency modulated signals. In the first example, a detection procedure is carried out on a frequency modulated signal with a second-order polynomial phase in complex Gaussian white noise. We show that this technique yields a better probability of detection for the reduced-order phase signals than the traditional energy detector. It is also shown empirically that the distribution of the gradient noise in the first adaptive reflection coefficients approximates the Gaussian law. In the second example, the instantaneous frequency of the same observed signal is estimated. We show that this technique achieves a lower mean square error for the estimated frequencies at high signal-to-noise ratios than the adaptive line enhancer. The performance of adaptive lattice filters is then investigated for the second class of input signals, i.e., impulsive autoregressive processes with alpha-stable distributions. The concept of alpha-stable distributions is first introduced. We discuss why the stochastic gradient algorithm, which performs well for finite-variance input signals (such as frequency modulated signals in noise), does not converge quickly for infinite-variance stable processes, because it relies on the minimum mean-square error criterion.
To deal with such problems, the minimum dispersion criterion, fractional lower-order moments, and recently developed algorithms for stable processes are introduced. We then study the possibility of using the lattice structure for impulsive stable processes. Accordingly, two new algorithms, the least-mean P-norm lattice algorithm and its normalized version, are proposed for lattice filters based on fractional lower-order moments. Simulation results show that the proposed algorithms achieve faster convergence for parameter estimation of autoregressive stable processes with low to moderate degrees of impulsiveness than many other algorithms. We also discuss the effect of the impulsiveness of stable processes in generating misalignment between the estimated parameters and the true values. Due to the infinite variance of stable processes, the performance of the proposed algorithms is investigated using extensive computer simulations only.
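A single stage of a stochastic gradient (adaptive) lattice filter of the kind analysed in the thesis can be sketched as below for real-valued signals. The update rule, step size and stabilising clip are illustrative textbook choices, not the thesis code.

```python
import numpy as np

def gal_stage(x, mu=0.005):
    """One stage of a real-valued stochastic gradient lattice filter.
    Returns forward/backward residuals and the reflection-coefficient track.
    Step size and the stability clip are illustrative choices."""
    n = len(x)
    k = 0.0
    f = np.zeros(n)
    b = np.zeros(n)
    k_track = np.zeros(n)
    b_prev = 0.0  # delayed backward residual b_{m-1}(n-1)
    for t in range(n):
        f[t] = x[t] + k * b_prev          # forward prediction error
        b[t] = b_prev + k * x[t]          # backward prediction error
        # stochastic gradient step on E[f^2 + b^2] w.r.t. k
        k -= mu * (f[t] * b_prev + b[t] * x[t])
        k = max(min(k, 0.999), -0.999)    # keep the stage stable
        k_track[t] = k
        b_prev = x[t]
    return f, b, k_track

# AR(1) input with coefficient 0.8: the optimal first reflection
# coefficient is -rho(1) = -0.8
rng = np.random.default_rng(0)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()
_, _, k_track = gal_stage(x)
print(round(float(k_track[-500:].mean()), 2))  # ~ -0.8 (noisy estimate)
```

The residual gradient noise around the converged value of k is exactly the kind of fluctuation whose asymptotic variance the thesis models analytically.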

Relevance:

90.00%

Publisher:

Abstract:

This thesis is the result of an investigation of a Queensland example of curriculum reform based on outcomes, a type of reform common to many parts of the world during the last decade. The purpose of the investigation was to determine the impact of outcomes on teacher perspectives of professional practice. This focus was chosen to permit investigation not only of changes in behaviour resulting from the reform but also of teachers' attitudes and beliefs developed during implementation. The study is based on qualitative methodology, chosen for its suitability for the investigation of attitudes and perspectives. It exploits the researcher's opportunities for prolonged, direct contact with groups of teachers through an over-arching ethnographic approach, designed to capture the holistic nature of the reform and to contextualise the data within a broad perspective. The selection of grounded theory as a basis for data analysis reflects the open nature of this inquiry and demonstrates the study's constructivist assumptions about the production of knowledge. The study also constitutes a multi-site case study by virtue of the choice of three individual school sites as objects to be studied and to form the basis of the report. Three primary school sites administered by Brisbane Catholic Education were chosen as the focus of data collection. Data were collected from the three sites as teachers engaged in the first year of implementation of Student Performance Standards, the Queensland version of English outcomes based on the current English syllabus. Teachers' experience of outcomes-driven curriculum reform was studied by means of group interviews conducted at individual school sites over a period of fourteen months, researcher observations and the collection of artefacts such as report cards. Analysis of data followed grounded theory guidelines based on a system of coding.
Though classification systems were not generated prior to data analysis, the labelling of categories called on standard, non-idiosyncratic terminology and on analytic frames and concepts from the existing literature wherever practicable, in order to permit comparisons with other related research. Data from the school sites were examined individually and then combined to determine teacher understandings of the reform, changes made to practice and teacher responses to these changes in terms of their perspectives of professionalism. Teachers in the study understood the reform primarily as an accountability mechanism. Though they showed some acceptance of the reform's intentions, their responses to its conceptualisation, supporting documentation and implications for changing work practices were generally characterised by reduced confidence, anger and frustration. Though the impact of outcomes-based curriculum reform must be interpreted through the inter-relationships of the broad range of elements that comprise teachers' work and their attitudes towards it, the substantive findings of the study can be understood in terms of four broad themes. First, when the conceptual design of outcomes did not serve teachers' accountability requirements and outcomes were perceived to be expressed in unfamiliar technical language, most teachers in the study lost faith in the value of the reform and lost confidence in their own abilities to understand or implement it. Second, this loss of confidence was intensified when outcomes extended beyond the teachers' existing curriculum and assessment planning, confronting them with the necessity to include aspects of syllabuses or school programs they had previously omitted through a lack of understanding or appreciation. The corollary was that outcomes promoted greater syllabus fidelity when frameworks were closely aligned.
Third, other benefits the teachers associated with outcomes included the development of whole-school curriculum resources and greater opportunity for teacher collaboration, particularly among schools. The teachers, however, considered a wide range of factors when determining the overall impact of the reform, and perceived a number of them as costs of implementation. These included the emergence of ethical dilemmas concerning relationships with students, colleagues and parents; reduced individual autonomy, particularly with regard to the selection of valued curriculum content; and intensification of workload, with the capacity to erode the relationships with students that teachers strongly associated with the rewards of their profession. Finally, in banding together at the school level to resist aspects of implementation, some teachers showed growing awareness of a collective authority capable of being exercised in response to top-down reform. These findings imply that Student Performance Standards require review and additional implementation resourcing to support teachers through times of reduced confidence in their own abilities. Outcomes proved an effective means of high-fidelity syllabus implementation and, provided they are expressed in an accessible way and aligned with syllabus frameworks and terminology, should be considered for inclusion in future syllabuses across a range of learning areas. The study also identifies a range of unintended consequences of outcomes-based curriculum and acknowledges the complexity of relationships among all aspects of teachers' work. It notes that the impact of reform on teacher perspectives of professional practice may alter teacher-teacher and school-system relationships in ways that have the potential to influence the effectiveness of future curriculum reform.

Relevance:

90.00%

Publisher:

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. 
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established and used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal; for the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique involving dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path, designated the DPA(+) technique. The additional linear measurement enables the tissue decomposition to be extended to three components: bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence indicate its potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: 1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements; 2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique; 3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and 4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
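The core Monte Carlo idea used in such photon-transport codes, sampling free path lengths from the exponential attenuation law, can be illustrated with a toy narrow-beam transmission estimate. This is only a sketch of the sampling principle; the thesis code additionally models scattering, electron binding-energy corrections and complex geometry.

```python
import numpy as np

def transmitted_fraction(mu, thickness, n_photons=200_000, seed=1):
    """Monte Carlo estimate of narrow-beam photon transmission through a
    homogeneous slab: sample free path lengths from the exponential law
    and count photons whose first interaction lies beyond the slab."""
    rng = np.random.default_rng(seed)
    path = rng.exponential(1.0 / mu, n_photons)  # free path lengths (cm)
    return float(np.mean(path > thickness))

mu = 0.2   # linear attenuation coefficient, 1/cm (illustrative value)
t = 5.0    # slab thickness, cm
est = transmitted_fraction(mu, t)
# The estimate should agree with the analytic Beer-Lambert result exp(-mu*t)
print(round(est, 3), round(float(np.exp(-mu * t)), 3))  # both ~ 0.368
```

Agreement with the analytic result is the simplest version of the theory-versus-model validation described in the abstract.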

Relevance:

90.00%

Publisher:

Abstract:

The Lane Change Test (LCT) is one of the growing number of methods developed to quantify driving performance degradation brought about by the use of in-vehicle devices. Beyond its validity and reliability, for such a test to be of practical use, it must also be sensitive to the varied demands of individual tasks. The current study evaluated the ability of several recent LCT lateral control and event detection parameters to discriminate between visual-manual and cognitive surrogate In-Vehicle Information System tasks with different levels of demand. Twenty-seven participants (mean age 24.4 years) completed a PC version of the LCT while performing visual search and math problem solving tasks. A number of the lateral control metrics were found to be sensitive to task differences, but the event detection metrics were less able to discriminate between tasks. The mean deviation and lane excursion measures were able to distinguish between the visual and cognitive tasks, but were less sensitive to the different levels of task demand. The other LCT metrics examined were less sensitive to task differences. A major factor influencing the sensitivity of at least some of the LCT metrics could be the type of lane change instructions given to participants. The provision of clear and explicit lane change instructions and further refinement of its metrics will be essential for increasing the utility of the LCT as an evaluation tool.
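A lateral-control metric of the kind discussed above can be illustrated with a toy computation. The simplified definition used here (mean absolute lateral offset from a normative lane-change path at matched sample points) and the sample values are assumptions for illustration, not the LCT's exact specification.

```python
import numpy as np

def mean_deviation(lateral_pos, reference_path):
    """Simplified LCT-style mean deviation: average absolute distance of
    the driven lateral position from a normative lane-change path,
    sampled at matched longitudinal positions."""
    lateral_pos = np.asarray(lateral_pos, dtype=float)
    reference_path = np.asarray(reference_path, dtype=float)
    return float(np.mean(np.abs(lateral_pos - reference_path)))

# Illustrative samples (metres): driven path vs. normative model path
driven = [0.0, 0.4, 1.1, 2.2, 3.1, 3.5, 3.5]
normative = [0.0, 0.5, 1.5, 2.5, 3.3, 3.5, 3.5]
print(round(mean_deviation(driven, normative), 3))  # 0.143
```

A distracting secondary task that delays or flattens lane changes increases this number, which is why the metric can discriminate between task demands.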

Relevance:

90.00%

Publisher:

Abstract:

In order to estimate the safety impact of roadway interventions, engineers need to collect, analyze, and interpret the results of carefully implemented data collection efforts. The intent of these studies is to develop Accident Modification Factors (AMFs), which are used to predict the safety impact of various road safety features at other locations or in future enhancements. Models are typically estimated for total crashes, but can and should be estimated for specific crash outcomes as well. This paper first describes data collected with the intent to estimate AMFs for rural intersections in the state of Georgia in the United States. Modeling results of crash prediction models for the crash outcomes angle, head-on, rear-end, sideswipe (same direction and opposite direction) and pedestrian-involved crashes are then presented and discussed. The analysis reveals that factors such as the Annual Average Daily Traffic (AADT), the presence of turning lanes, and the number of driveways have a positive association with each type of crash, while the median width and the presence of lighting are negatively associated with crashes. The model covariates are related to crash outcomes in different ways, suggesting that crash outcomes are associated with different pre-crash conditions.
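An AMF, as used above, is conventionally the ratio of expected crash frequency with a safety feature to the expected frequency without it; values below 1.0 indicate a safety benefit. A minimal sketch, with invented figures:

```python
def accident_modification_factor(expected_with, expected_without):
    """AMF as the ratio of expected crash frequency with a road safety
    feature to the expected frequency without it. An AMF below 1.0
    indicates the feature is associated with fewer crashes."""
    return expected_with / expected_without

# Invented illustration: a feature associated with 4.2 vs. 6.0
# expected crashes per year at comparable sites
print(round(accident_modification_factor(4.2, 6.0), 2))  # 0.7
```

The crash prediction models described in the paper supply the expected frequencies that feed a computation of this form.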

Relevance:

90.00%

Publisher:

Abstract:

Many studies focused on the development of crash prediction models have resulted in aggregate models that quantify the safety effects of geometric, traffic, and environmental factors on the expected number of total, fatal, injury, and/or property damage crashes at specific locations. Crash prediction models focused on predicting different crash types, however, have rarely been developed. Crash type models are useful for at least three reasons. The first is motivated by the need to identify sites that are high risk with respect to specific crash types but that may not be revealed through crash totals. Second, countermeasures are likely to affect only a subset of all crashes (usually called target crashes), so examination of crash types will improve the ability to identify effective countermeasures. Finally, there is a priori reason to believe that different crash types (e.g., rear-end, angle, etc.) are associated with road geometry, the environment, and traffic variables in different ways, justifying the estimation of individual predictive models. The objectives of this paper are to (1) demonstrate that different crash types are associated with predictor variables in different ways (as theorized) and (2) show that estimation of crash type models may lead to greater insights regarding crash occurrence and countermeasure effectiveness. This paper first describes the estimation results of crash prediction models for angle, head-on, rear-end, sideswipe (same direction and opposite direction), and pedestrian-involved crash types. Serving as a basis for comparison, a crash prediction model is also estimated for total crashes. Based on 837 motor vehicle crashes collected at two-lane rural intersections in the state of Georgia, six prediction models are estimated, resulting in two Poisson (P) models and four negative binomial (NB) models.
The analysis reveals that factors such as the annual average daily traffic, the presence of turning lanes, and the number of driveways have a positive association with each type of crash, whereas median widths and the presence of lighting are negatively associated. For the best-fitting models, covariates are related to crash types in different ways, suggesting that crash types are associated with different pre-crash conditions and that modeling total crash frequency may not be helpful for identifying specific countermeasures.
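The Poisson model family referred to above can be sketched with a minimal numpy implementation fitted by Newton/IRLS. The simulated data, covariate and coefficients below are assumptions for illustration, not the paper's data or estimation code (negative binomial fitting adds an overdispersion parameter on top of this).

```python
import numpy as np

def poisson_fit(X, y, n_iter=25):
    """Fit a Poisson crash-frequency model log(mu) = X @ beta by Newton /
    iteratively reweighted least squares."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 1e-9)  # start at the constant-rate model
    for _ in range(n_iter):
        mu = np.exp(np.clip(X @ beta, -30.0, 30.0))  # clip avoids overflow
        # Newton step: beta += (X' W X)^-1 X' (y - mu), with W = diag(mu)
        beta += np.linalg.solve((X * mu[:, None]).T @ X, X.T @ (y - mu))
    return beta

# Simulated intersections: intercept plus a log(AADT)-style exposure term
rng = np.random.default_rng(2)
log_aadt = rng.uniform(6.0, 10.0, 400)
X = np.column_stack([np.ones(400), log_aadt])
y = rng.poisson(np.exp(X @ np.array([-5.0, 0.7])))
print(np.round(poisson_fit(X, y), 2))  # close to [-5.0, 0.7]
```

Fitting one such model per crash type, with a shared set of candidate covariates, is the modelling strategy the paper argues for.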

Relevance:

90.00%

Publisher:

Abstract:

Emergency departments (EDs) are often the first point of contact with an abused child. Despite a legal mandate, the reporting of definite or suspected abusive injury to child safety authorities by ED clinicians varies due to a number of factors, including training, access to child safety professionals, departmental culture and a fear of 'getting it wrong'. This study examined the quality of documentation and coding of child abuse captured by ED-based injury surveillance data and ED medical records in the state of Queensland, and the concordance of these data with child welfare records. A retrospective medical record review was used to examine the clinical documentation of almost 1000 injured children included in the Queensland Injury Surveillance Unit (QISU) database from 10 hospitals in urban and rural centres. Independent experts re-coded the records based on their review of the notes. A data linkage methodology was then used to link these records with records in the state government's child welfare database. Cases were sampled from three sub-groups according to the surveillance intent codes: maltreatment by parent, undetermined and unintentional injury. Only 0.1% of cases coded as unintentional injury were recoded to maltreatment by parent, while 1.2% of cases coded as maltreatment by parent were reclassified as unintentional, and 5% of cases where the intent was undetermined by the triage nurse were recoded as maltreatment by parent. The quality of documentation varied across types of hospital (tertiary referral centre, children's, urban, regional and remote), and the concordance of health data with child welfare data varied across patient subgroups. Outcomes from this research will guide initiatives to improve the quality of intentional child injury surveillance systems.

Relevance:

90.00%

Publisher:

Abstract:

The detached housing scheme is a unique and exclusive segment of the residential property market in Malaysia. Generally, the product is expensive, and for the many Malaysians who can afford one, owning a detached house is a once-in-a-lifetime opportunity. In spite of this, most owners fail to fully comprehend the specific needs of this type of housing scheme, increasing the risk of a problematic project. Unlike other types of pre-designed 'mass housing' schemes, a detached housing scheme may be built specifically to cater to the needs and demands of its owner. Maximum owner participation as the development progresses is therefore vital to the success of the project. In addition, due to its unique design, the house must individually comply with the requirements and regulations of the relevant authorities. Failure of the owner to recognise this will result in delays, fines and penalties, disputes and ultimately cost overruns. These circumstances highlight the need for a model to guide the owner through the entire development process of a detached house. This research therefore aims to develop a model for successful detached housing development in Malaysia through maximising owner participation during its various development stages. To achieve this, questionnaire surveys and case studies will be employed to capture the experiences of owners who have developed detached houses in Malaysia, and relevant statistical tools will be applied to analyse the responses. The results will be synthesised into a model of successful detached housing development for the reference of future detached housing owners in Malaysia.

Relevância:

90.00% 90.00%

Publicador:

Resumo:

Type unions, pointer variables and function pointers are a long-standing source of subtle security bugs in C program code. Their use can lead to hard-to-diagnose crashes or exploitable vulnerabilities that allow an attacker to gain privileged access to classified data. This paper describes an automatable framework for detecting such weaknesses in C programs statically, where possible, and for generating assertions that will detect them dynamically in the remaining cases. Based exclusively on analysis of the source code, it identifies the required assertions using a type inference system supported by a custom-made symbol table. In our preliminary findings, the type system was able to infer the correct type of unions in different scopes without manual code annotations or rewriting. Whenever a static evaluation is not possible or is difficult to resolve, appropriate runtime assertions are formed and inserted into the source code. The approach is demonstrated via a prototype C analysis tool.

Relevância:

90.00% 90.00%

Publicador:

Resumo:

The rapid growth of mobile telephone use, satellite services, and now the wireless Internet and WLANs is generating tremendous change in telecommunications and networking. As indoor wireless communications become more prevalent, modeling indoor radio wave propagation in populated environments is a topic of significant interest. Wireless MIMO communication exploits phenomena such as multipath propagation to increase data throughput and range, or to reduce bit error rates, rather than attempting to eliminate the effects of multipath propagation as traditional SISO communication systems seek to do. The MIMO approach can yield significant gains in both link and network capacity, with no additional transmit power or bandwidth consumption compared to conventional single-array diversity methods. When MIMO and OFDM systems are combined and deployed in a suitably rich scattering environment, such as indoors, a significant capacity gain can be observed due to the assured multipath propagation. Channel variations can occur as a result of the movement of personnel, industrial machinery, vehicles and other equipment within the indoor environment. The time-varying effects on the propagation channel in populated indoor environments depend on the pedestrian traffic conditions and the particular type of environment considered. A systematic measurement campaign to study pedestrian movement effects in indoor MIMO-OFDM channels has not yet been fully undertaken, and measuring the channel variations caused by the relative positioning of pedestrians is essential in the study of indoor MIMO-OFDM broadband wireless networks. Theoretically, due to high multipath scattering, an increase in MIMO-OFDM channel capacity is expected when pedestrians are present.
However, measurements indicate that some reduction in channel capacity can be observed as the number of pedestrians approaches 10, because the additional human bodies absorb the wireless signals and reduce the multipath conditions. This dissertation presents a systematic characterization of the effects of pedestrians in indoor MIMO-OFDM channels. Measurement results, obtained with the MIMO-OFDM channel sounder developed at the CSIRO ICT Centre, have been validated by a customized Geometric Optics-based ray tracing simulation. Based on the measured and simulated MIMO-OFDM channel capacity and its dynamic range, an improved deterministic model for MIMO-OFDM channels in indoor populated environments is presented. The model can be used for the design and analysis of future WLANs deployed in indoor environments. The results show that, under both Fixed SNR and Fixed Tx deterministic conditions, the channel capacity dynamic range rose with the number of pedestrians as well as with the number of antenna combinations. In random scenarios with 10 pedestrians, an increase in channel capacity of up to 0.89 bits/sec/Hz under Fixed SNR and up to 1.52 bits/sec/Hz under Fixed Tx was recorded relative to the one-pedestrian scenario. In addition, a maximum increase in average channel capacity of 49% was measured with 4 antenna elements compared with 2. The highest measured average capacity, 11.75 bits/sec/Hz, corresponds to the 4x4 array with 10 pedestrians moving randomly. Moreover, the spread between the highest and lowest values of the dynamic range is larger for Fixed Tx (predicted 5.5 bits/sec/Hz, measured 1.5 bits/sec/Hz) than for the Fixed SNR criterion (predicted 1.5 bits/sec/Hz, measured 0.7 bits/sec/Hz). These findings were confirmed by both measurements and simulations for scenarios ranging from 1 to 5, 7 and 10 pedestrians.