Abstract:
Our previous study reported microorganisms in human follicular fluid. The objective of this study was to test human follicular fluid for the presence of microorganisms and to correlate these findings with in vitro fertilization (IVF) outcomes. In this study, 263 paired follicular fluids and vaginal swabs were collected from women undergoing IVF cycles, with various causes of infertility, and were cultured to detect microorganisms. The cause of infertility and the IVF outcomes for each woman were correlated with the microorganisms detected within follicular fluid collected at the time of trans-vaginal oocyte retrieval. Microorganisms isolated from follicular fluids were classified as: (1) ‘colonizers’ if microorganisms were detected within the follicular fluid, but not within the vaginal swab (at the time of oocyte retrieval); or (2) ‘contaminants’ if microorganisms detected in the vagina at the time of oocyte retrieval were also detected within the follicular fluid. The presence of Lactobacillus spp. in ovarian follicular fluids was associated with embryo maturation and transfer. This study revealed that microorganisms are present within follicular fluid itself and that the presence of particular microorganisms has an adverse effect on IVF outcomes, as seen by an overall decrease in embryo transfer rates and pregnancy rates in both fertile and infertile women, and in live birth rates in women with idiopathic infertility. Follicular fluid microorganisms are a potential cause of adverse pregnancy outcomes in IVF, in both infertile women and fertile women with infertile male partners.
Abstract:
Enterprise Systems (ES) can be understood as the de facto standard for holistic operational and managerial support within an organization. Most commonly, ES are offered as commercial off-the-shelf packages, requiring customization in the user organization. This process is a complex and resource-intensive task, which often prevents small and midsize enterprises (SME) from undertaking configuration projects. Especially in the SME market, independent software vendors provide pre-configured ES for a small customer base. The problem of ES configuration is thereby shifted from the customer to the vendor, but remains critical. We argue that the as yet unexplored link between process configuration and business document configuration must be examined more closely, as the two types of configuration are closely tied to one another.
Abstract:
In this work, we investigate how the hydrogen sensing performance of thermally evaporated MoO3 nanoplatelets can be further improved by RF sputtering a thin layer of tantalum oxide (Ta2O5) or lanthanum oxide (La2O3). We show that dissociated hydrogen atoms cause the thin film layer to be polarised, inducing a measurable potential difference greater than previously reported. We attribute these observations to the presence of numerous traps in the thin layer; their states allow stronger trapping of charge at the Pt-thin film oxide interface compared to MoO3 sensors without the coating. Under exposure to H2 (10 000 ppm), the maximum change in dielectric constant was 45.6 (at 260 °C) for the Ta2O5/MoO3 nanoplatelets and 31.6 (at 220 °C) for the La2O3/MoO3 nanoplatelets. The corresponding maximum sensitivities were 16.87 (at 260 °C) for Ta2O5/MoO3 and 7.52 (at 300 °C) for La2O3/MoO3.
Abstract:
Particulate matter research is essential because of the well-known, significant adverse effects of aerosol particles on human health and the environment. In particular, identification of the origin or sources of particulate matter emissions is of paramount importance in assisting efforts to control and reduce air pollution in the atmosphere. This thesis aims to: identify the sources of particulate matter; compare pollution conditions at urban, rural and roadside receptor sites; combine information about the sources with meteorological conditions at the sites to locate the emission sources; compare sources based on particle size or mass; and ultimately, provide the basis for control and reduction in particulate matter concentrations in the atmosphere. To achieve these objectives, data were obtained from assorted local and international receptor sites over long sampling periods. The samples were analysed using Ion Beam Analysis and Scanning Mobility Particle Sizer methods to measure the particle mass with chemical composition and the particle size distribution, respectively. Advanced data analysis techniques were employed to derive information from large, complex data sets. Multi-Criteria Decision Making (MCDM), a ranking method, drew on data variability to examine the overall trends, and provided the rank ordering of the sites and years in which sampling was conducted. Coupled with the receptor model Positive Matrix Factorisation (PMF), the pollution emission sources were identified and meaningful information pertinent to the prioritisation of control and reduction strategies was obtained. This thesis is presented in the thesis-by-publication format. It includes four refereed papers which together demonstrate a novel combination of data analysis techniques that enabled particulate matter sources to be identified and sampling sites/years ranked.
The strength of this source identification process was corroborated when the analysis procedure was expanded to encompass multiple receptor sites. Initially applied to identify the contributing sources at roadside and suburban sites in Brisbane, the technique was subsequently applied to three receptor sites (roadside, urban and rural) located in Hong Kong. The comparable results from these international and national sites over several sampling periods indicated similarities in source contributions between receptor site-types, irrespective of global location, and suggested the need to apply these methods to air pollution investigations worldwide. Furthermore, an investigation into particle size distribution data was conducted to deduce the sources of aerosol emissions based on particle size and elemental composition. Considering the adverse effects on human health caused by small-sized particles, knowledge of particle size distributions and their elemental composition provides a different perspective on the pollution problem. This thesis clearly illustrates that the application of an innovative combination of advanced data interpretation methods to identify particulate matter sources and rank sampling sites/years provides the basis for the prioritisation of future air pollution control measures. Moreover, this study contributes significantly to knowledge of the chemical composition of airborne particulate matter in Brisbane, Australia, and of the identity and plausible locations of the contributing sources. Such novel source apportionment and ranking procedures are ultimately applicable to environmental investigations worldwide.
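The receptor-modelling step described above rests on a non-negative factorization of the sample-by-species concentration matrix into source contributions and source profiles. The following is a minimal sketch only, using plain Lee-Seung multiplicative updates on synthetic data; PMF proper additionally weights each residual by its measurement uncertainty, which is omitted here.

```python
import numpy as np

def nmf(X, n_factors, n_iter=500, eps=1e-9, seed=0):
    """Minimal non-negative factorization X ~ G @ F via Lee-Seung
    multiplicative updates. Illustrative only: PMF proper also weights
    each residual by its measurement uncertainty."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, n_factors)) + eps   # source contributions (samples x factors)
    F = rng.random((n_factors, m)) + eps   # source profiles (factors x species)
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

# Toy data: twenty "samples" mixed from two synthetic source profiles.
true_F = np.array([[1.0, 0.0, 0.5],
                   [0.0, 1.0, 0.5]])
true_G = np.random.default_rng(1).random((20, 2))
X = true_G @ true_F
G, F = nmf(X, n_factors=2)
print(np.abs(X - G @ F).max())  # reconstruction error: small for this rank-2 toy
```

The non-negativity of both factors is what makes the recovered profiles physically interpretable as emission sources.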
Abstract:
In a recent paper, Gordon, Muratov, and Shvartsman studied a partial differential equation (PDE) model describing radially symmetric diffusion and degradation in two and three dimensions. They paid particular attention to the local accumulation time (LAT), also known in the literature as the mean action time, which is a spatially dependent timescale that can be used to provide an estimate of the time required for the transient solution to effectively reach steady state. They presented exact results for three-dimensional applications and gave approximate results for the two-dimensional analogue. Here we make two generalizations of Gordon, Muratov, and Shvartsman’s work: (i) we present an exact expression for the LAT in any dimension and (ii) we present an exact expression for the variance of the distribution. The variance provides useful information regarding the spread about the mean that is not captured by the LAT. We conclude by describing further extensions of the model that were not considered by Gordon, Muratov, and Shvartsman. We have found that exact expressions for the LAT can also be derived for these important extensions.
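For context, these timescales have a standard integral form in the mean-action-time literature; the following is a hedged restatement of those common definitions, not formulas quoted from the paper above.

```latex
% Hedged restatement of common definitions (not quoted from the paper):
% with c(x,t) -> c_s(x) as t -> infinity, define
F(x,t) = 1 - \frac{c(x,t)}{c_s(x)}, \qquad
f(x,t) = -\frac{\partial F(x,t)}{\partial t},
% so that f(x,t) acts as a probability density in t;
% the LAT and the variance are then the first two central moments:
\tau(x) = \int_0^\infty t\, f(x,t)\,\mathrm{d}t
        = \int_0^\infty F(x,t)\,\mathrm{d}t, \qquad
V(x) = \int_0^\infty t^2\, f(x,t)\,\mathrm{d}t - \tau(x)^2 .
```

The second expression for the variance is exactly the "spread about the mean" that the abstract notes is not captured by the LAT alone.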
Abstract:
The electron Volt Spectrometer (eVS) is an inverse geometry filter difference spectrometer that has been optimised to measure the single atom properties of condensed matter systems using a technique known as Neutron Compton Scattering (NCS) or Deep Inelastic Neutron Scattering (DINS). The spectrometer utilises the high flux of epithermal neutrons produced by the ISIS neutron spallation source, enabling the direct measurement of atomic momentum distributions and ground state kinetic energies. In this paper the procedure used to calibrate the spectrometer is described. This includes details of the method used to determine detector positions and neutron flight path lengths, as well as the determination of the instrument resolution. Examples of measurements on three different samples, ZrH2, 4He and Sn, are shown, which demonstrate the self-consistency of the calibration procedure.
Abstract:
The rapid growth of visual information on the Web has led to immense interest in multimedia information retrieval (MIR). While advances in MIR systems have achieved some success in specific domains, particularly through content-based approaches, general Web users still struggle to find the images they want. Despite the success in content-based object recognition or concept extraction, the major problem in current Web image searching remains the querying process. Since most online users only express their needs in semantic terms or objects, systems that utilize visual features (e.g., color or texture) to search images create a semantic gap which hinders general users from fully expressing their needs. In addition, query-by-example (QBE) retrieval imposes extra obstacles for exploratory search because users may not always have a representative image at hand or in mind when starting a search (i.e., the page zero problem). As a result, the majority of current online image search engines (e.g., Google, Yahoo, and Flickr) still primarily use textual queries to search. The problem with query-based retrieval systems is that they only capture users’ information needs in terms of formal queries; the implicit and abstract parts of users’ information needs are inevitably overlooked. Hence, users often struggle to formulate queries that best represent their needs, and some compromises have to be made. Studies of Web search logs suggest that multimedia searches are more difficult than textual Web searches, and that Web image searching is the most difficult compared to video or audio searches. Hence, online users need to put in more effort when searching multimedia content, especially for image searches. Most interactions in Web image searching occur during query reformulation. While log analysis provides intriguing views on how the majority of users search, their search needs or motivations are ultimately neglected.
User studies on image searching have attempted to understand users’ search contexts in terms of users’ background (e.g., knowledge, profession, motivation for search and task types) and the search outcomes (e.g., use of retrieved images, search performance). However, these studies typically focused on particular domains with a selective group of professional users. General users’ Web image searching contexts and behaviors are little understood, although they represent the majority of online image searching activities nowadays. We argue that only by understanding Web image users’ contexts can current Web search engines further improve their usefulness and provide more efficient searches. In order to understand users’ search contexts, a user study was conducted based on university students’ Web image searching in News, Travel, and commercial Product domains. The three search domains were deliberately chosen to reflect image users’ interests in people, time, event, location, and objects. We investigated participants’ Web image searching behavior, with a focus on query reformulation and search strategies. Participants’ search contexts, such as their search background, motivation for search, and search outcomes, were gathered by questionnaires. The searching activity was recorded along with participants’ think-aloud data for analyzing significant search patterns. The relationships between participants’ search contexts and corresponding search strategies were discovered using a Grounded Theory approach. Our key findings include the following aspects:
- Effects of users' interactive intents on query reformulation patterns and search strategies
- Effects of task domain on task specificity and task difficulty, as well as on some specific searching behaviors
- Effects of searching experience on result expansion strategies
A contextual image searching model was constructed based on these findings.
The model helped us understand Web image searching from the user's perspective, and introduced a context-aware searching paradigm for current retrieval systems. A query recommendation tool was also developed to demonstrate how users’ query reformulation contexts can potentially contribute to more efficient searching.
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a one-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security aspects required of hash functions.
Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training on both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, and this is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
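The pipeline described above (feature extraction, linear randomization, threshold quantization, binary encoding) can be sketched as follows. This is a minimal illustration, not the dissertation's algorithm: the random-projection randomization, the median-based threshold training and all data are assumptions made for the example.

```python
import numpy as np

def train_thresholds(train_features, proj):
    """Learn per-bit quantization thresholds (medians here) from projected
    training data. Which data are used for this training is precisely what
    the text argues affects both hashing accuracy and security."""
    return np.median(train_features @ proj, axis=0)

def robust_hash(feature, proj, thresholds):
    """Linear randomization (secret random projection) followed by
    1-bit quantization against the learned thresholds."""
    return (feature @ proj > thresholds).astype(np.uint8)

rng = np.random.default_rng(0)
proj = rng.standard_normal((64, 32))      # secret key: randomization matrix
train = rng.standard_normal((1000, 64))   # stand-in "training features"
thresholds = train_thresholds(train, proj)

x = rng.standard_normal(64)               # stand-in image features
h1 = robust_hash(x, proj, thresholds)
h2 = robust_hash(x + 0.01 * rng.standard_normal(64), proj, thresholds)
print((h1 != h2).mean())  # fraction of flipped bits; small for a minor change
```

The small Hamming distance between `h1` and `h2` is the "robustness" property; the linearity of the projection that makes this work is also the security weakness the dissertation targets.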
Abstract:
Introduction: The use of amorphous-silicon electronic portal imaging devices (a-Si EPIDs) for dosimetry is complicated by the effects of scattered radiation. In photon radiotherapy, the primary signal at the detector can be accompanied by photons scattered from linear accelerator components, detector materials, intervening air, treatment room surfaces (floor, walls, etc.) and from the patient/phantom being irradiated. Consequently, EPID measurements which presume to take scatter into account are highly sensitive to the identification of these contributions. One example of this susceptibility is the process of calibrating an EPID for use as a gauge of (radiological) thickness, where specific allowance must be made for the effect of phantom scatter on the intensity of radiation measured through different thicknesses of phantom. This is usually done via a theoretical calculation which assumes that phantom scatter is linearly related to thickness and field size. We have, however, undertaken a more detailed study of the scattering effects of fields of different dimensions when applied to phantoms of various thicknesses, in order to derive scatter-to-primary ratios (SPRs) directly from simulation results. This allows us to make a more accurate calibration of the EPID, and to qualify the appositeness of the theoretical SPR calculations. Methods: This study uses a full MC model of the entire linac-phantom-detector system simulated using the EGSnrc/BEAMnrc codes. The Elekta linac and EPID are modelled according to specifications from the manufacturer and the intervening phantoms are modelled as rectilinear blocks of water or plastic, with their densities set to a range of physically realistic and unrealistic values. Transmissions through these various phantoms are calculated using the dose detected in the model EPID and used in an evaluation of the field-size dependence of SPR, in different media, applying a method suggested for experimental systems by Swindell and Evans [1].
These results are compared firstly with SPRs calculated using the theoretical, linear relationship between SPR and irradiated volume, and secondly with SPRs evaluated from our own experimental data. An alternate evaluation of the SPR in each simulated system is also made by modifying the BEAMnrc user code READPHSP to identify and count those particles in a given plane of the system that have undergone a scattering event. In addition to these simulations, which are designed to closely replicate the experimental setup, we also used MC models to examine the effects of varying the setup in experimentally challenging ways (changing the size of the air gap between the phantom and the EPID, and changing the longitudinal position of the EPID itself). Experimental measurements used in this study were made using an Elekta Precise linear accelerator, operating at 6 MV, with an Elekta iView GT a-Si EPID. Results and Discussion: 1. Comparison with theory: With the Elekta iView EPID fixed at 160 cm from the photon source, the phantoms, when positioned isocentrically, are located 41 to 55 cm from the surface of the panel. At this geometry, a close but imperfect agreement (differing by up to 5%) can be identified between the results of the simulations and the theoretical calculations. However, this agreement can be totally disrupted by shifting the phantom out of the isocentric position. Evidently, the allowance made for source-phantom-detector geometry by the theoretical expression for SPR is inadequate to describe the effect that phantom proximity can have on measurements made using an (infamously low-energy-sensitive) a-Si EPID. 2. Comparison with experiment: For various square field sizes and across the range of phantom thicknesses, there is good agreement between simulation data and experimental measurements of the transmissions and the derived values of the primary intensities.
However, the values of SPR obtained through these simulations and measurements seem to be much more sensitive to slight differences between the simulated and real systems, leading to difficulties in producing a simulated system which adequately replicates the experimental data. (For instance, small changes to the simulated phantom density make large differences to the resulting SPR.) 3. Comparison with direct calculation: By developing a method for directly counting the number of scattered particles reaching the detector after passing through the various isocentric phantom thicknesses, we show that the experimental method discussed above provides a good measure of the actual degree of scattering produced by the phantom. This calculation also permits the analysis of the scattering sources/sinks within the linac and EPID, as well as the phantom and intervening air. Conclusions: This work challenges the assumption that scatter to and within an EPID can be accounted for using a simple, linear model. The simulations discussed here are intended to contribute to a fuller understanding of the contribution of scattered radiation to the EPID images that are used in dosimetry calculations. Acknowledgements: This work is funded by the NHMRC, through a project grant, and supported by the Queensland University of Technology (QUT) and the Royal Brisbane and Women's Hospital, Brisbane, Australia. The authors are also grateful to Elekta for the provision of manufacturing specifications which permitted the detailed simulation of their linear accelerators and amorphous-silicon electronic portal imaging devices. Computational resources and services used in this work were provided by the HPC and Research Support Group, QUT, Brisbane, Australia.
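The simple linear scatter model that this work tests can be written down compactly. The sketch below is illustrative only: the exponential primary-transmission form and the placeholder values for the attenuation coefficient `mu` and the scatter coefficient `k` are assumptions for the example, not values from the study.

```python
import math

def transmission(thickness_cm, field_side_cm, mu=0.05, k=1e-4):
    """Linear scatter model: primary signal attenuates exponentially,
    while the scatter-to-primary ratio (SPR) grows linearly with
    thickness and field area. mu and k are illustrative placeholders."""
    primary = math.exp(-mu * thickness_cm)
    spr = k * thickness_cm * field_side_cm ** 2
    return primary * (1.0 + spr), spr

def spr_from_transmission(measured_T, thickness_cm, mu=0.05):
    """Invert the model: any signal above the expected primary
    transmission is attributed to phantom scatter."""
    return measured_T / math.exp(-mu * thickness_cm) - 1.0

T, spr_true = transmission(20.0, 10.0)           # 20 cm phantom, 10 cm field
print(spr_true, spr_from_transmission(T, 20.0))  # the toy model inverts exactly
```

The study's point is that this inversion only works in practice if the assumed linear SPR model actually holds, which the Monte Carlo results show is geometry-dependent.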
Abstract:
Despite the increasing number of immigrants, there is a limited body of literature describing the use of hospital emergency department (ED) care by immigrants in Australia. This study aims to describe how immigrants from refugee source countries (IRSC) utilise ED care, compared to immigrants from the main English speaking countries (MESC), immigrants from other countries (IOC) and the local population in Queensland. A retrospective analysis of a Queensland state-wide hospital ED dataset (ED Information System) from 1-1-2008 to 31-12-2010 was conducted. Our study showed that immigrants are not a homogenous group. We found that IRSC are more likely to use interpreters (8.9%) in the ED compared to IOC. Furthermore, IRSC have a higher rate of ambulance use (odds ratio 1.2, 95% confidence interval (CI) 1.2–1.3), are less likely to be admitted to the hospital from the ED (odds ratio 0.7, 95% CI 0.7–0.8), and have a longer length of stay (LOS) in the ED (mean difference 33.0 minutes, 95% CI 28.8–37.2) compared to the Australian-born population. Our findings highlight the need to develop policies and educational interventions to ensure the equitable use of health services among vulnerable immigrant populations.
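For readers unfamiliar with the statistics quoted above, an odds ratio with a Wald-type 95% confidence interval can be computed from a 2x2 table as follows. The counts used are made up for illustration and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI:
    exp(log(OR) +/- z * SE), where SE = sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: (used ambulance, did not) for an exposed group
# vs a reference group.
or_, lo, hi = odds_ratio_ci(240, 760, 200, 800)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval that excludes 1.0, as in the ambulance-use result quoted above, indicates a statistically significant difference between the groups.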
Abstract:
Objectives: This study examines the accuracy of Gestational Diabetes Mellitus (GDM) case-ascertainment in routinely collected data. Methods: This retrospective cohort study analysed routinely collected data from all births at Cairns Base Hospital, Australia, from 1 January 2004 to 31 December 2010 in the Cairns Base Hospital Clinical Coding system (CBHCC) and the Queensland Perinatal Data Collection (QPDC). GDM case-ascertainment in the National Diabetes Services Scheme (NDSS) and Cairns Diabetes Centre (CDC) data was compared. Results: From 2004 to 2010, the specificity of GDM case-ascertainment in the QPDC was 99%. In 2010, only 2 of 225 additional cases were identified from the CDC and CBHCC, suggesting QPDC sensitivity is also over 99%. In comparison, the sensitivity of the CBHCC data was 80% during 2004–2010. The sensitivity of CDC data was 74% in 2010. During 2010, 223 births were coded as GDM in the QPDC, and the NDSS registered 247 women with GDM from the same postcodes, suggesting reasonable uptake of the NDSS register. However, the proportion of Aboriginal and Torres Strait Islander women was lower than expected. Conclusion: The accuracy of GDM case-ascertainment in the QPDC appears high, with lower accuracy in routinely collected hospital and local health service data. This limits the capacity of local data for planning, evaluation, and developing structured systems to improve post-pregnancy care, and may underestimate the resources required. Implications: Data linkage should be considered to improve the accuracy of routinely collected local health service data. The accuracy of the NDSS for Aboriginal and Torres Strait Islander women requires further evaluation.
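The sensitivity and specificity figures quoted above follow from the usual definitions against a reference standard. A minimal sketch with illustrative counts, chosen only to reproduce percentages of the same order as those reported, not the study's actual tables:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Case-ascertainment accuracy against a reference standard:
    sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only (not the study's data): a data source that
# records 80 of 100 true GDM cases and wrongly flags 10 of 1000 non-cases.
sens, spec = sensitivity_specificity(tp=80, fn=20, tn=990, fp=10)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # sensitivity 80%, specificity 99%
```

High specificity with modest sensitivity, as in the CBHCC result, means false positives are rare but a substantial fraction of true cases is missed.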
Abstract:
In 2004 Prahalad made managers aware of the great economic opportunity that the population at the BoP (Base of the Pyramid) could represent for business in the form of new potential consumers. However, MNCs (Multi-National Corporations) have continued to fail in penetrating low-income markets, arguably because the applied strategies are often the same as those adopted at the top of the pyramid. Even in those few cases where products get re-envisioned, their introduction in contexts of extreme poverty only induces new needs and develops new dependencies. At best, the rearrangement of business models by MNCs has meant the realization of CSR (Corporate Social Responsibility) schemes that have validity from a marketing perspective, but still lack the crucial element of social embeddedness (London & Hart, 2004). Today the challenge is to reach the lowest population tier with reinvented business models based on principles of value co-creation. Starting from a view of the potential consumer at the BoP as a ring of continuity in the value chain process, a resource that can itself produce value, this paper concludes by proposing an alternative, innovative approach to operating in developing markets that overturns the roles of MNCs and the BoP. The proposed perspective of a ‘reversed’ source of innovation and primary target market builds on two fundamental tenets: traditional knowledge is rich and greatly unexploited, and markets at the top of the pyramid are saturated with unnecessary products/practices that have lost contact with the natural environment.
Abstract:
Aerosol mass spectrometers (AMS) are powerful tools for the analysis of the chemical composition of airborne particles, particularly organic aerosols, which are gaining increasing attention. However, the advantages of AMS in providing on-line data can be outweighed by the difficulties involved in its use in field measurements at multiple sites. In contrast to on-line measurement by AMS, a method which involves sample collection on filters followed by subsequent analysis by AMS could significantly broaden the scope of AMS application. We report the application of such an approach to field studies at multiple sites. An AMS was deployed at 5 urban schools to determine the sources of the organic aerosols at the schools directly. PM1 aerosols were also collected on filters at these and 20 other urban schools. The filters were extracted with water and the extracts run through a nebulizer to generate the aerosols, which were analysed by an AMS. The mass spectra from the samples collected on filters at the 5 schools were found to have excellent correlations with those obtained directly by AMS, with r2 ranging from 0.89 to 0.98. Filter recoveries varied between the schools from 40% to 115%, possibly indicating that this method provides qualitative rather than quantitative information. The stability of the organic aerosols on Teflon filters was demonstrated by analysing samples stored for up to two years. Application of the procedure to the remaining 20 schools showed that secondary organic aerosols were the main source of aerosols at the majority of the schools. Overall, this procedure provides an accurate representation of the mass spectra of ambient organic aerosols and could facilitate rapid data acquisition at multiple sites where an AMS could not be deployed for logistical reasons.
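The filter-versus-direct comparison above rests on squared Pearson correlations between mass spectra. A minimal sketch with synthetic spectra; the vectors and the noise level are assumptions made for the example, not the study's data.

```python
import numpy as np

def r_squared(x, y):
    """Squared Pearson correlation between two mass spectra
    (signal intensity per m/z channel)."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

# Synthetic spectra: the filter-based spectrum modelled as a rescaled,
# noisy copy of the directly measured AMS spectrum.
rng = np.random.default_rng(0)
direct = rng.random(120)                              # direct AMS spectrum
filter_based = 0.7 * direct + 0.15 * rng.random(120)  # imperfect recovery
print(round(r_squared(direct, filter_based), 3))
```

Note that r2 is insensitive to overall scaling, which is why the spectra can correlate strongly even when absolute filter recoveries vary widely between schools.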