Abstract:
Wireless network technologies, such as IEEE 802.11 based wireless local area networks (WLANs), have been adopted in wireless networked control systems (WNCS) for real-time applications. Distributed real-time control requires satisfaction of (soft) real-time performance from the underlying networks for delivery of real-time traffic. However, IEEE 802.11 networks are not designed for WNCS applications. They neither inherently provide quality-of-service (QoS) support, nor explicitly consider the characteristics of the real-time traffic on networked control systems (NCS), i.e., periodic round-trip traffic. Therefore, the adoption of 802.11 networks in real-time WNCSs causes challenging problems for network design and performance analysis. Theoretical methodologies are yet to be developed for computing the best achievable WNCS network performance under the constraints of real-time control requirements. Focusing on IEEE 802.11 distributed coordination function (DCF) based WNCSs, this paper analyses several important NCS network performance indices, such as throughput capacity, round trip time and packet loss ratio under the periodic round trip traffic pattern, a unique feature of typical NCSs. Considering periodic round trip traffic, an analytical model based on Markov chain theory is developed for deriving these performance indices under a critical real-time traffic condition, at which the real-time performance constraints are marginally satisfied. Case studies are also carried out to validate the theoretical development.
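As a rough illustration of the computational core of such an analysis (not the paper's actual model), the Python sketch below computes the stationary distribution of a small discrete-time Markov chain; the 3-state transition matrix is purely hypothetical, standing in for a DCF backoff chain from which indices such as throughput and packet loss ratio would then be derived.

```python
# Minimal sketch (not the paper's model): steady-state distribution of a
# discrete-time Markov chain, the basic building block of Bianchi-style
# 802.11 DCF analyses from which throughput and collision probability follow.
import numpy as np

def stationary_distribution(P):
    """Solve pi P = pi with sum(pi) = 1 for a row-stochastic transition matrix P."""
    n = P.shape[0]
    # Stack the balance equations with the normalisation constraint.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Hypothetical 3-state backoff chain, used only to exercise the function.
P = np.array([[0.2, 0.8, 0.0],
              [0.0, 0.3, 0.7],
              [0.6, 0.0, 0.4]])
print(stationary_distribution(P))
```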
Abstract:
Burkholderia pseudomallei, the causative agent of melioidosis, is associated with soil. This study used a geographic information system (GIS) to determine the spatial distribution of clinical cases of melioidosis in the endemic suburban region of Townsville in Australia. A total of 65 cases over the period 1996–2008 were plotted using residential addresses. Two distinct groupings were found. One was around the base of a hill in the city centre and the other followed the old course of a major waterway in the region. Both groups (accounting for 43 of the 65 cases examined) are in areas expected to have particularly wet topsoils following intense rainfall, due to soil type or landscape position.
Abstract:
Purpose: To undertake rigorous psychometric testing of the newly developed contemporary work environment measure (the Brisbane Practice Environment Measure [B-PEM]) using exploratory factor analysis and confirmatory factor analysis. Methods: Content validity of the 33-item measure was established by a panel of experts. Initial testing involved 195 nursing staff using principal component factor analysis with varimax rotation (orthogonal) and Cronbach's alpha coefficients. Confirmatory factor analysis was conducted using data from a further 983 nursing staff. Results: Principal component factor analysis yielded a four-factor solution with eigenvalues greater than 1 that explained 52.53% of the variance. These factors were then verified using confirmatory factor analysis. Goodness-of-fit indices showed an acceptable fit overall with the full model, explaining 21% to 73% of the variance. Deletion of items took place throughout the evolution of the instrument, resulting in a 26-item, four-factor measure called the Brisbane Practice Environment Measure-Tested. Conclusions: The B-PEM has undergone rigorous psychometric testing, providing evidence of internal consistency and goodness-of-fit indices within acceptable ranges. The measure can be utilised as a subscale or total score reflective of a contemporary nursing work environment. Clinical Relevance: An up-to-date instrument to measure practice environment may be useful for nursing leaders to monitor the workplace and to assist in identifying areas for improvement, facilitating greater job satisfaction and retention.
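For readers unfamiliar with the procedure, the following sketch shows the generic shape of such an analysis: principal-component extraction with the eigenvalue-greater-than-1 rule plus Cronbach's alpha for internal consistency. The data are synthetic placeholders, not the B-PEM responses.

```python
# Illustrative sketch only (not the B-PEM data): principal components with the
# eigenvalue > 1 rule, and Cronbach's alpha computed directly with NumPy.
import numpy as np
from sklearn.decomposition import PCA

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of scale scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
X = rng.normal(size=(195, 33))             # placeholder for 195 responses, 33 items

pca = PCA().fit(X)
eigenvalues = pca.explained_variance_
retained = (eigenvalues > 1).sum()          # Kaiser criterion
print("components with eigenvalue > 1:", retained)
print("variance explained by them: %.1f%%"
      % (100 * pca.explained_variance_ratio_[:retained].sum()))
print("Cronbach's alpha (all items): %.2f" % cronbach_alpha(X))
```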
Abstract:
Zero energy buildings (ZEB) and zero energy homes (ZEH) are a current hot topic globally for policy makers (what are the benefits and costs), designers (how do we design them), the construction industry (can we build them), marketing (will consumers buy them) and researchers (do they work and what are the implications). This paper presents initial findings from actual measured data from a 9-star (as-built), off-ground detached family home constructed in south-east Queensland in 2008. The integrated systems approach to the design of the house is analysed in each of its three main goals: maximising the thermal performance of the building envelope, minimising energy demand whilst maintaining energy service levels, and implementing a multi-pronged low carbon approach to energy supply. The performance outcomes of each of these stages are evaluated against definitions of Net Zero Carbon / Net Zero Emissions (Site and Source) and Net Zero Energy (onsite generation vs. primary energy imports). The paper concludes with a summary of the multiple benefits of combining very high efficiency building envelopes with diverse energy management strategies: a level of robustness, resilience, affordability and autonomy not generally seen in housing.
Abstract:
Most information retrieval (IR) models treat the presence of a term within a document as an indication that the document is somehow "about" that term; they do not take into account when a term might be explicitly negated. Medical data, by its nature, contains a high frequency of negated terms - e.g. "review of systems showed no chest pain or shortness of breath". This paper presents a study of the effects of negation on information retrieval. We present a number of experiments to determine whether negation has a significant negative effect on IR performance and whether language models that take negation into account might improve performance. We use a collection of real medical records as our test corpus. Our findings are that negation has some effect on system performance, but this will likely be confined to domains such as medical data where negation is prevalent.
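As a minimal illustration of the general idea (not the language models studied in the paper), the sketch below drops terms that follow a negation trigger before they are indexed, loosely in the spirit of NegEx-style negation detection; the trigger list and window size are arbitrary choices.

```python
# Toy sketch (not the paper's model): exclude terms inside a short window
# after a negation trigger before adding them to the index.
import re

NEGATION_TRIGGERS = {"no", "denies", "without", "negative"}
WINDOW = 6  # number of terms after a trigger treated as negated (illustrative choice)

def positive_terms(text):
    """Return the tokens of `text` that do not fall inside a negation window."""
    tokens = re.findall(r"[a-z]+", text.lower())
    keep, skip = [], 0
    for tok in tokens:
        if tok in NEGATION_TRIGGERS:
            skip = WINDOW
            continue
        if skip > 0:
            skip -= 1
            continue
        keep.append(tok)
    return keep

doc = "Review of systems showed no chest pain or shortness of breath."
print(positive_terms(doc))   # ['review', 'of', 'systems', 'showed']
```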
Abstract:
Before 2001, most Africans immigrating to Australia were white South Africans and Zimbabweans who arrived as economic and family-reunion migrants (Cox, Cooper & Adepoju, 1999). Black African communities are a more recent addition to the Australian landscape, with most entering Australia as refugees after 2001. African refugees are a particularly disadvantaged immigrant group, which the Department of Immigration and Multicultural Affairs (in the Community Relations Commission of New South Wales, 2006) suggests require high levels of settlement support (p.23). Decision makers and settlement service providers need to have settlement data on the communities so that they can be effective in planning, budgeting and delivering support where it is most needed. Settlement data are also useful for determining the challenges that these communities face in trying to establish themselves in resettlement. There has been no verification of existing secondary data sources, however, or previous formal study of African refugee settlement geography in Southeast Queensland. This research addresses the knowledge gap by using a mixed-method approach to identify and describe the distribution and population size of eight African communities in Southeast Queensland, examine secondary migration patterns in these communities and assess the relationship between these geographic features and housing, a critical factor in successful settlement. Significant discrepancies exist between the primary data gathered in the study and existing secondary data relating to population size and distribution of the communities. Results also reveal a tension between the socio-cultural forces and the housing and economic imperatives driving secondary migration in the communities, and a general lack of engagement by African refugees with structured support networks. These findings have a wide range of implications for policy and for groups that provide settlement support to these communities.
Abstract:
Compressed natural gas (CNG) engines are thought to be less harmful to the environment than conventional diesel engines, especially in terms of particle emissions. Although this is true with respect to particulate matter (PM) emissions, results of particle number (PN) emission comparisons have been inconclusive. In this study, results of on-road and dynamometer studies of buses were used to derive several important conclusions. We show that, although PN emissions from CNG buses are significantly lower than from diesel buses at low engine power, they become comparable at high power. For diesel buses, PN emissions are not significantly different between acceleration and operation at steady maximum power. However, the corresponding PN emissions from CNG buses when accelerating are an order of magnitude greater than when operating at steady maximum power. During acceleration under heavy load, PN emissions from CNG buses are an order of magnitude higher than from diesel buses. The particles emitted from CNG buses are too small to contribute to PM10 emissions or to a reduction of visibility, and may consist of semivolatile nanoparticles.
Abstract:
The dynamics of droplets exhaled from the respiratory system during coughing or talking are addressed. A mathematical model is presented accounting for the motion of a droplet in conjunction with its evaporation. Droplet evaporation and motion are accounted for under two scenarios: (1) a well-mixed droplet and (2) a droplet with inner composition variation. A multiple-shell model was implemented to account for internal mass and heat transfer and for concentration and temperature gradients inside the droplet. The trajectories of the droplets are computed for a range of conditions, and the spatial distribution and residence times of such droplets are evaluated.
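A minimal sketch of the first scenario (a well-mixed droplet), not the paper's multi-shell model, is given below: a d²-law evaporation rate coupled with quasi-steady Stokes settling. All parameter values are chosen purely for illustration.

```python
# Minimal sketch of scenario 1 (a well-mixed droplet), not the paper's full
# multi-shell model: d^2-law shrinkage coupled with quasi-steady Stokes
# settling. All parameter values are illustrative placeholders.
import numpy as np

K      = 1.0e-9   # evaporation constant, m^2/s (placeholder)
RHO_P  = 1000.0   # droplet density, kg/m^3
MU_AIR = 1.8e-5   # air dynamic viscosity, Pa.s
G      = 9.81     # gravitational acceleration, m/s^2

def settle(d0, height=1.6, dt=1e-3):
    """Track droplet diameter and fall distance until it evaporates or lands."""
    d, z, t = d0, height, 0.0
    while d > 0 and z > 0:
        v = RHO_P * G * d**2 / (18 * MU_AIR)   # Stokes settling velocity
        z -= v * dt
        d2 = d**2 - K * dt                      # d^2-law shrinkage
        d = np.sqrt(max(d2, 0.0))
        t += dt
    return ("evaporated" if d == 0 else "deposited"), t

for d0 in (10e-6, 50e-6, 100e-6):
    fate, t = settle(d0)
    print(f"{d0*1e6:.0f} um droplet: {fate} after {t:.1f} s")
```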
Abstract:
This overview focuses on the application of chemometrics techniques for the investigation of soils contaminated by polycyclic aromatic hydrocarbons (PAHs) and metals, because these two important and very diverse groups of pollutants are ubiquitous in soils. The salient features of various studies carried out in the micro- and recreational environments of humans are highlighted in the context of the various multivariate statistical techniques available across discipline boundaries that have been effectively used in soil studies. Particular attention is paid to techniques employed in the geosciences that may be effectively utilized for environmental soil studies; classical multivariate approaches that may be used in isolation or as complementary methods to these are also discussed. Chemometrics techniques widely applied in atmospheric studies for identifying sources of pollutants or for determining the importance of contaminant source contributions to a particular site have seen little use in soil studies, but may be effectively employed in such investigations. Suitable programs are also available for suggesting mitigating measures in cases of soil contamination, and these are also considered. Specific techniques reviewed include pattern recognition techniques such as Principal Components Analysis (PCA), Fuzzy Clustering (FC) and Cluster Analysis (CA); geostatistical tools include variograms, Geographical Information Systems (GIS), contour mapping and kriging; source identification and contribution estimation methods reviewed include Positive Matrix Factorisation (PMF) and Principal Component Analysis on Absolute Principal Component Scores (PCA/APCS). Mitigating measures to limit or eliminate pollutant sources may be suggested through the use of ranking analysis and multi-criteria decision-making (MCDM) methods. These methods are mainly represented in this review by studies employing the Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and its associated graphic output, Geometrical Analysis for Interactive Aid (GAIA).
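As a simple illustration of one of the reviewed techniques, the sketch below applies PCA to a synthetic site-by-analyte matrix standing in for PAH and metal concentrations; it is not drawn from any study covered in the review.

```python
# Sketch of one reviewed technique (PCA) on a hypothetical site-by-analyte
# soil matrix; the data are synthetic stand-ins for PAH/metal concentrations.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
analytes = ["naphthalene", "pyrene", "Pb", "Zn", "Cd"]
X = rng.lognormal(mean=0.0, sigma=1.0, size=(40, len(analytes)))  # 40 sampling sites

# Standardise first (concentration ranges differ by orders of magnitude), then project.
pca = PCA(n_components=2).fit(StandardScaler().fit_transform(X))
print("variance explained by PC1, PC2:", pca.explained_variance_ratio_)
print("PC1 loadings:", dict(zip(analytes, pca.components_[0].round(2))))
```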
Abstract:
Enterprise Architectures have emerged as comprehensive corporate artefacts that provide structure to the plethora of conceptual views on an enterprise. The recent popularity of a service-oriented design of organizations has added services and related constructs as new elements that require consideration within an Enterprise Architecture. This paper analyzes and compares the existing proposals for how to best integrate services into Enterprise Architectures. It uses the popular Zachman Framework as an example and differentiates the existing integration alternatives. This research can be generalized beyond service integration into an investigation of how Enterprise Architectures might be extended with emerging constructs.
Abstract:
This work examines the algebraic cryptanalysis of small-scale variants of LEX-BES. LEX-BES is a stream cipher based on the Advanced Encryption Standard (AES) block cipher. LEX is a generic method proposed for constructing a stream cipher from a block cipher, initially introduced by Biryukov at eSTREAM, the ECRYPT Stream Cipher project, in 2005. The Big Encryption System (BES) is a block cipher introduced at CRYPTO 2002, which facilitates the algebraic analysis of the AES block cipher. In this article, experiments were conducted to find solutions of equation systems describing small-scale LEX-BES using Gröbner Basis computations. This follows a similar approach to the work by Cid, Murphy and Robshaw at FSE 2005 that investigated algebraic cryptanalysis of small-scale variants of the BES. The difference between LEX-BES and BES is that, due to the way the keystream is extracted, the number of unknowns in the LEX-BES equations is smaller than in BES. As far as the authors know, this is the first attempt at creating solvable equation systems for stream ciphers based on the LEX method using Gröbner Basis computations.
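The computational primitive involved can be illustrated independently of LEX-BES: the toy SymPy example below computes a Gröbner basis for a two-variable polynomial system over GF(2). The system itself is made up and far smaller than any cipher equation system.

```python
# Tiny illustration of the computational primitive used in the paper
# (Groebner-basis solving), not the LEX-BES equation system itself.
from sympy import symbols, groebner

x, y = symbols("x y")
# A made-up two-variable system over GF(2), loosely echoing BES-style field arithmetic.
system = [x*y + x + 1, x**2 + y]
G = groebner(system, x, y, order="lex", modulus=2)
print(G)
```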
Abstract:
The use of porous structures as tissue engineering scaffolds imposes demands on structural parameters such as porosity, pore size and interconnectivity. For the structural analysis of porous scaffolds, micro-computed tomography (μCT) is an ideal tool. μCT is a 3D X-ray imaging method that has several advantages over scanning electron microscopy (SEM) and other conventional characterisation techniques:
• visualisation in 3D
• quantitative results
• non-destructiveness
• minimal sample preparation
Abstract:
A novel method was developed for a quantitative assessment of pore interconnectivity using micro-CT data. This method makes use of simulated spherical particles percolating through the interconnected pore network. For each sphere diameter, the accessible pore volume is calculated. This algorithm was applied to compare the pore interconnectivity of two different scaffold architectures: one created by salt-leaching and the other by stereolithography. The algorithm revealed a much higher pore interconnectivity for the latter.
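A simplified sketch of this idea (not the authors' implementation) is given below: a binary pore volume is probed with virtual spheres of increasing radius via morphological erosion and dilation, and the pore fraction still reachable from the sample boundary is reported for each radius. The test volume is synthetic.

```python
# Simplified sketch: probe a binary pore volume with virtual spheres and report
# the pore fraction reachable from the sample boundary at each sphere radius.
import numpy as np
from scipy import ndimage

def sphere(radius):
    """Binary spherical structuring element with the given voxel radius."""
    r = int(radius)
    zz, yy, xx = np.mgrid[-r:r+1, -r:r+1, -r:r+1]
    return (xx**2 + yy**2 + zz**2) <= r**2

def accessible_pore_fraction(pores, radius):
    """pores: boolean array, True = pore voxel."""
    se = sphere(radius)
    eroded = ndimage.binary_erosion(pores, structure=se)   # centres where a sphere fits
    labels, n = ndimage.label(eroded)
    if n == 0:
        return 0.0
    # Keep only sphere-centre regions connected to the volume boundary.
    border = np.zeros_like(eroded)
    border[0, :, :] = border[-1, :, :] = True
    border[:, 0, :] = border[:, -1, :] = True
    border[:, :, 0] = border[:, :, -1] = True
    reachable_labels = np.unique(labels[border & eroded])
    reachable = np.isin(labels, reachable_labels[reachable_labels > 0])
    # Dilate back to recover the pore volume those spheres can sweep.
    accessible = ndimage.binary_dilation(reachable, structure=se) & pores
    return accessible.sum() / pores.sum()

rng = np.random.default_rng(2)
volume = ndimage.binary_opening(rng.random((60, 60, 60)) > 0.55)  # synthetic pore space
for r in (1, 2, 3):
    print(f"radius {r} voxels: accessible fraction {accessible_pore_fraction(volume, r):.2f}")
```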
Abstract:
Myosin is believed to act as the molecular motor for many actin-based motility processes in eukaryotes. It is becoming apparent that a single species may possess multiple myosin isoforms, and at least seven distinct classes of myosin have been identified from studies of animals, fungi, and protozoans. The complexity of the myosin heavy-chain gene family in higher plants was investigated by isolating and characterizing myosin genomic and cDNA clones from Arabidopsis thaliana. Six myosin-like genes were identified from three polymerase chain reaction (PCR) products (PCR1, PCR11, PCR43) and three cDNA clones (ATM2, MYA2, MYA3). Sequence comparisons of the deduced head domains suggest that these myosins are members of two major classes. Analysis of the overall structure of the ATM2 and MYA2 myosins shows that they are similar to the previously-identified ATM1 and MYA1 myosins, respectively. The MYA3 appears to possess a novel tail domain, with five IQ repeats, a six-member imperfect repeat, and a segment of unique sequence. Northern blot analyses indicate that some of the Arabidopsis myosin genes are preferentially expressed in different plant organs. Combined with previous studies, these results show that the Arabidopsis genome contains at least eight myosin-like genes representing two distinct classes.
Abstract:
The theory of nonlinear dynamic systems provides some new methods to handle complex systems. Chaos theory offers new concepts, algorithms and methods for processing, enhancing and analyzing the measured signals. In recent years, researchers have been applying the concepts from this theory to bio-signal analysis. In this work, the complex dynamics of bio-signals such as the electrocardiogram (ECG) and electroencephalogram (EEG) are analyzed using the tools of nonlinear systems theory. In modern industrialized countries, several hundred thousand people die every year from sudden cardiac death. The ECG is an important biosignal representing the sum total of millions of cardiac cell depolarization potentials. It contains important insight into the state of health and nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool to observe the heart's ability to respond to normal regulatory impulses that affect its rhythm. A computer-based intelligent system for analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are non-linear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of non-linear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeat and four classes of arrhythmia. This thesis presents some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. Several features were extracted from the HOS and subjected to an Analysis of Variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding a p-value < 0.02 in the ANOVA test. An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, seven features were extracted from the heart rate signals using HOS and fed to a support vector machine (SVM) for classification. The performance evaluation protocol in this thesis uses 330 subjects consisting of five different kinds of cardiac disease conditions. The classifier achieved a sensitivity of 90% and a specificity of 89%. This system is ready to run on larger data sets.
In EEG analysis, the search for hidden information for identification of seizures has a long history. Epilepsy is a pathological condition characterized by spontaneous and unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. An automatic early detection of the seizure onsets would help the patients and observers to take appropriate precautions. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by the higher order spectra (HOS) has been reported to be a promising approach to differentiate between normal, background (pre-ictal) and epileptic EEG signals. In this work, these features are used to train both a Gaussian mixture model (GMM) classifier and a Support Vector Machine (SVM) classifier. Results show that the classifiers were able to achieve 93.11% and 92.67% classification accuracy, respectively, with selected HOS-based features. About 2 hours of EEG recordings from 10 patients were used in this study.
This thesis introduces unique bispectrum and bicoherence plots for various cardiac conditions and for normal, background and epileptic EEG signals. These plots reveal distinct patterns. The patterns are useful for visual interpretation by those without a deep understanding of spectral analysis, such as medical practitioners. The thesis includes original contributions in extracting features from HRV and EEG signals using HOS and entropy, in analyzing the statistical properties of such features on real data, and in automated classification using these features with GMM and SVM classifiers.
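As an illustration of the classification stage only, the sketch below feeds synthetic stand-ins for seven HOS-derived features to an SVM and reports sensitivity and specificity; neither the features nor the labels correspond to the thesis data.

```python
# Minimal sketch of the classification stage, using synthetic stand-ins for
# the HOS-derived features (not the thesis's actual feature set or labels).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(3)
n, n_features = 330, 7                       # 330 subjects, 7 HOS-based features
X = rng.normal(size=(n, n_features))
y = rng.integers(0, 2, size=n)               # 0 = normal, 1 = abnormal (placeholder labels)
X[y == 1] += 1.0                             # inject separation so the sketch is non-trivial

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("sensitivity: %.2f" % (tp / (tp + fn)))
print("specificity: %.2f" % (tn / (tn + fp)))
```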