922 results for: Traditional enrichment method
Abstract:
The vibration serviceability limit state is an important design consideration for two-way, suspended concrete floors that is not always well understood by many practicing structural engineers. Although the field of floor vibration has been extensively developed, at present there are no convenient design tools that deal with this problem. Results from this research have enabled the development of a much-needed new method for assessing the vibration serviceability of flat, suspended concrete floors in buildings, named the Response Coefficient-Root Function (RCRF) method. Full-scale laboratory tests were conducted on a post-tensioned floor specimen at Queensland University of Technology's structural laboratory. Special support brackets were fabricated to act as frictionless, pinned connections at the corners of the specimen. A series of static and dynamic tests was performed in the laboratory to obtain the basic material and dynamic properties of the specimen. Finite element models were calibrated against data collected from the laboratory experiments, and computational finite element analysis was extended to investigate a variety of floor configurations. Field measurements of floors in existing buildings are in good agreement with the computational studies. Results from this parametric investigation have led to the development of a new approach for predicting the design frequencies and accelerations of flat concrete floor structures. The RCRF method is a convenient tool to assist structural engineers in designing for the vibration serviceability limit state of in-situ concrete floor systems.
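As background to the frequency predictions discussed above, the fundamental frequency of an idealized, simply supported rectangular slab can be computed from classical Kirchhoff thin-plate theory. The sketch below shows only that textbook formula, not the RCRF method itself; the slab dimensions and material constants are illustrative assumptions.

```python
import math

def plate_natural_frequency(a, b, h, E, rho, nu, m=1, n=1):
    """Natural frequency (Hz) of mode (m, n) of a simply supported
    rectangular thin plate, per classical Kirchhoff plate theory."""
    D = E * h**3 / (12 * (1 - nu**2))   # flexural rigidity (N*m)
    omega = math.pi**2 * ((m / a)**2 + (n / b)**2) * math.sqrt(D / (rho * h))
    return omega / (2 * math.pi)

# Hypothetical 8 m x 8 m, 250 mm concrete panel (values illustrative only):
# E = 30 GPa, density 2400 kg/m^3, Poisson's ratio 0.2.
f1 = plate_natural_frequency(a=8.0, b=8.0, h=0.25, E=30e9, rho=2400, nu=0.2)
```

Real flat slabs are continuous over columns rather than simply supported, so in practice (as in the research above) calibrated finite element models are needed rather than this closed-form idealization.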
Abstract:
The traditional Vector Space Model (VSM) cannot represent both the structure and the content of XML documents. This paper introduces a novel method of representing XML documents in a Tensor Space Model (TSM) and then utilizing it for clustering. Empirical analysis shows that the proposed method is scalable to large datasets; moreover, the factorized matrices produced by the method help improve the quality of the clusters through an enriched document representation that captures both structure and content information.
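One way to picture a tensor representation of XML documents is as a third-order array indexed by document, structure path, and term, with a low-rank factorization of an unfolding yielding document features for clustering. The sketch below is a generic illustration under that assumption, not the paper's actual TSM construction or factorization algorithm; the toy corpus is invented.

```python
import numpy as np

# Toy corpus: each XML document reduced to (structure-path, term) pairs.
docs = [
    {("article/title", "xml"), ("article/body", "tensor")},
    {("article/title", "xml"), ("article/body", "cluster")},
    {("book/title", "vector"), ("book/body", "model")},
]
paths = sorted({p for d in docs for p, _ in d})
terms = sorted({t for d in docs for _, t in d})

# Third-order tensor: documents x structure-paths x terms.
T = np.zeros((len(docs), len(paths), len(terms)))
for i, d in enumerate(docs):
    for p, t in d:
        T[i, paths.index(p), terms.index(t)] = 1.0

# Mode-1 unfolding followed by a truncated SVD gives a low-rank
# document representation usable as input to a clustering algorithm.
X = T.reshape(len(docs), -1)          # docs x (paths * terms)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
doc_features = U[:, :2] * s[:2]       # rank-2 document embedding
```

Unlike a flat VSM term vector, the tensor keeps a term's structural context (which path it occurred under) as a separate mode, which is the property the enriched representation exploits.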
Abstract:
Background and purpose: The appropriate fixation method for hemiarthroplasty of the hip, as it relates to implant survivorship and patient mortality, is a matter of ongoing debate. We examined the influence of fixation method on revision rate and mortality. Methods: We analyzed approximately 25,000 hemiarthroplasty cases from the AOA National Joint Replacement Registry. Deaths at 1 day, 1 week, 1 month, and 1 year were compared for all patients and among subgroups based on implant type. Results: Patients treated with cemented monoblock hemiarthroplasty had a 1.7-times higher day-1 mortality than those with uncemented monoblock components (p < 0.001). This finding was reversed at 1 week, 1 month, and 1 year after surgery (p < 0.001). Modular hemiarthroplasties showed no difference in mortality between fixation methods at any time point. Interpretation: This study shows lower (or similar) overall mortality with cemented hemiarthroplasty of the hip.
Abstract:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, caused by abnormalities in the properties of the tear film, is one of the most commonly reported eye health problems. Current clinical tools to assess tear film properties have shown certain limitations. The traditional invasive methods for assessing tear film quality, which are used by most clinicians, have been criticized for a lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence, no "gold standard" test is currently available to assess tear film integrity. Improving techniques for assessing tear film quality is therefore of clinical significance, and is the main motivation for the work described in this thesis. In this study, changes in tear film surface quality (TFSQ) were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea, and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The light is reflected from the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern; when the tear film surface presents irregularities, the pattern also becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film.
However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics. A set of novel routines has been purposely developed to quantify changes in the reflected pattern and to extract a time-series estimate of TFSQ from the video recording. The routines extract from each frame of the video a maximized area of analysis, within which a metric of TFSQ is calculated. Initially, two metrics, based on Gabor filtering and Gaussian gradient-based techniques, were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare-eye and contact lens wearing conditions. Thus, HSV appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each non-invasive method to discriminate dry eye from normal subjects was also investigated, and receiver operating characteristic (ROC) curves were calculated to assess each method's ability to predict dry eye syndrome.
The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique identified during this clinical study was a lack of sensitivity for quantifying the build-up/formation phase of the tear film cycle. For that reason, an extra metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into a quasi-straight-line image from which a block statistic is extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully comprehend the HSV measurement and the instrument's potential limitations. Of special interest was the assessment of the instrument's sensitivity to subtle topographic changes. The theoretical simulations helped provide some understanding of tear film dynamics; for instance, the model derived for the build-up phase gave some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model the time series and to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques for modeling the tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria.
Special attention was given to a commonly used fit, the polynomial function, and to the considerations for selecting an appropriate model order so that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of high-speed videokeratoscopy for assessing tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance over a broad range of conditions (i.e., contact lens wearers, normal and dry eye subjects). As a result, this technique could become a useful clinical tool for assessing tear film surface quality.
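The Cartesian-to-polar block metric described above can be illustrated generically: once concentric rings are resampled onto an (r, θ) grid they become near-horizontal lines, so a within-row statistic measures pattern regularity. This is a minimal reconstruction of the idea under stated assumptions, not the thesis's actual routines; the synthetic ring images below are invented for illustration.

```python
import numpy as np

def to_polar(img, n_r=64, n_theta=128):
    """Resample a square image onto a polar (r, theta) grid centred on the
    image centre, so concentric rings map to near-horizontal lines."""
    h, w = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    rs = np.linspace(0, min(cx, cy), n_r)
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    ys = (cy + rs[:, None] * np.sin(thetas)).round().astype(int)
    xs = (cx + rs[:, None] * np.cos(thetas)).round().astype(int)
    return img[ys, xs]                  # shape (n_r, n_theta)

def ring_regularity(img):
    """Block statistic: for a perfectly regular ring pattern each polar row
    is nearly constant, so the mean within-row standard deviation is small;
    a disturbed pattern yields a larger value."""
    polar = to_polar(img)
    return polar.std(axis=1).mean()

# Synthetic test images: regular rings vs. an angularly distorted pattern.
y, x = np.mgrid[0:101, 0:101] - 50
r = np.hypot(x, y)
theta = np.arctan2(y, x)
regular = np.sin(r / 3)                            # depends on radius only
distorted = np.sin(r / 3 + 0.8 * np.sin(3 * theta))  # angular disturbance
```

A rising `ring_regularity` value over the frames of a recording would then track degradation of the reflected pattern in the inter-blink interval.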
Abstract:
This research underlines the extensive application of nanostructured metal oxides in environmental systems such as hazardous waste remediation and water purification. The study seeks to forge a new understanding of the complexity of adsorption and photocatalysis in water treatment. Sodium niobate doped with different amounts of tantalum was prepared via a hydrothermal reaction and was observed to adsorb highly hazardous bivalent radioactive isotopes such as Sr2+ and Ra2+ ions. This work facilitates the preparation of Nb-based adsorbents for efficiently removing toxic radioactive ions from contaminated water and also identifies the importance of understanding the influence of heterovalent substitution in microporous frameworks. Clay adsorbents were prepared via a two-step method to remove anionic and non-ionic herbicides from water. Firstly, layered beidellite clay was treated with acid in a hydrothermal process; secondly, common silane coupling agents, 3-chloropropyl trimethoxysilane or triethoxysilane, were grafted onto the acid-treated samples to prepare the adsorption materials. In order to isolate the effect of the clay surface, the adsorption properties of the clay adsorbents were compared with those of γ-Al2O3 nanofibres grafted with the same functional groups. Thin alumina (γ-Al2O3) nanofibres were modified by grafting two organosilane agents, 3-chloropropyltriethoxysilane and octyltriethoxysilane, onto the surface for the adsorptive removal of the herbicides alachlor and imazaquin from water. The formation of organic groups during functionalisation established superhydrophobic sites along the surfaces, and these non-polar regions were able to make close contact with the organic pollutants. A new structure of anatase crystals linked to clay fragments was synthesised by the reaction of TiOSO4 with laponite clay for the degradation of pesticides.
Depending on the Ti/clay ratio, these new catalysts showed a high degradation rate compared with P25. Moreover, TiO2 immobilized on laponite clay fragments could be readily separated from the slurry after the photocatalytic reaction. Using a series of partial phase transition methods, an effective catalyst with fibril morphology was prepared for the degradation of different types of phenols and trace amounts of herbicides in water. Both H-titanate and TiO2(B) fibres coated with anatase nanocrystals were studied. Compared with the laponite clay photocatalyst, anatase-dotted TiO2(B) fibres prepared by a 45 h hydrothermal treatment followed by calcination were not only superior in photocatalytic performance but could also be readily separated from the slurry after reaction. This study has laid the foundation for fabricating highly efficient nanostructured solids for the removal of radioactive ions and organic pollutants from contaminated water, and the results seem set to contribute to the development of advanced water purification devices. These modified nanostructured materials with unusual properties have broadened their application range beyond traditional use as adsorbents to encompass the storage of nuclear waste after it is concentrated from contaminated water.
Abstract:
Hydrogels, which are three-dimensional crosslinked hydrophilic polymers, have been widely used and studied as vehicles for drug delivery due to their good biocompatibility. Traditional methods of loading therapeutic proteins into hydrogels have some disadvantages: the biological activity of drugs or proteins can be compromised during the polymerization process, and loading can be very time-consuming. Therefore, different loading methods have been investigated. Based on the theory of electrophoresis, an electrochemical gradient can be used to transport proteins into hydrogels, so an electrophoretic loading method was used in this study. Chemically and radiation-crosslinked polyacrylamide was used as the model system for loading protein electrophoretically into hydrogels. Different methods of preparing the polymers were studied, and showed the effect of crosslinker (bisacrylamide) concentration on protein loading and release behaviour. The mechanism of protein release from the hydrogels was anomalous diffusion (i.e. the process was non-Fickian). UV-Vis spectra of the proteins before and after reduction showed that their bioactivity was maintained after release from the hydrogel. Because of concerns about the cytotoxicity of residual monomer in polyacrylamide, poly(2-hydroxyethyl methacrylate) (pHEMA) was used as the second test material. In order to control the pore size, a polyethylene glycol (PEG) porogen was introduced into the pHEMA. The hydrogel disintegrated after immersion in water, indicating that the swelling forces exceeded the strength of the material. To understand the cause of the disintegration, several combinations of crosslinker concentration and preparation method were studied; however, the hydrogel still disintegrated after immersion in water, principally due to osmotic forces. A hydrogel suitable for drug delivery needs to be biocompatible and also robust.
Therefore, an approach to improving the mechanical properties of the porogen-containing pHEMA hydrogel by introducing an inter-penetrating network (IPN) into the hydrogel system was researched. A double network was formed by introducing further HEMA solution into the system by both electrophoresis and slow diffusion. Raman spectroscopy was used to observe the diffusion of HEMA into the hydrogel prior to further crosslinking by γ-irradiation. The protein loading and release behaviour of the hydrogel with enhanced mechanical properties was also studied. Biocompatibility is a very important factor for the biomedical application of hydrogels. The different hydrogels were therefore studied on both a three-dimensional HSE model and an HSE wound model, and showed no detrimental effect on keratinocyte cells, demonstrating good biocompatibility in both models. Given advantages such as the ability to absorb and deliver proteins or drugs, these hydrogels have potential as topical materials for wound healing and other biomedical applications.
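The non-Fickian release behaviour noted above is commonly diagnosed by fitting the Korsmeyer-Peppas power law, M_t/M_inf = k * t**n, to the early part of the release curve; for a thin film, n near 0.5 indicates Fickian diffusion while 0.5 < n < 1.0 indicates anomalous (non-Fickian) transport. The sketch below shows that standard fit; it is not taken from the thesis, and the release data are hypothetical.

```python
import numpy as np

def release_exponent(t, frac_released):
    """Fit the Korsmeyer-Peppas power law M_t/M_inf = k * t**n by linear
    regression in log-log space and return the release exponent n."""
    n, log_k = np.polyfit(np.log(t), np.log(frac_released), 1)
    return n

# Hypothetical release curve generated with n = 0.7 (anomalous transport).
t = np.array([0.5, 1, 2, 4, 8, 16])   # hours
frac = 0.1 * t**0.7                    # cumulative fraction of protein released

n = release_exponent(t, frac)
```

In practice the fit is restricted to the portion of the curve below roughly 60% release, where the power law is considered valid.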
Abstract:
Human hair fibres are ubiquitous in nature and are found frequently at crime scenes, often as a result of exchange between the perpetrator, victim and/or the surroundings, in accordance with Locard's Principle. Hair fibre evidence can therefore provide important information for crime investigation. For human hair evidence, current forensic methods of analysis rely on comparisons of either hair morphology by microscopic examination or nuclear and mitochondrial DNA analyses. Unfortunately, in some instances the use of microscopy and DNA analyses is difficult and often not feasible. This dissertation is arguably the first comprehensive investigation aimed at comparing, classifying and identifying single human scalp hair fibres with the aid of FTIR-ATR spectroscopy in a forensic context. Spectra were collected from the hair of 66 subjects of Asian, Caucasian and African (i.e. African-type) origin. The fibres ranged from untreated to variously mildly and heavily cosmetically treated hairs. The collected spectra reflect the physical and chemical nature of the hair near its surface, particularly the cuticle layer. In total, 550 spectra were acquired and processed to construct a relatively large database. To assist with the interpretation of the complex spectra from various types of human hair, derivative spectroscopy and chemometric methods were utilised: Principal Component Analysis (PCA), Fuzzy Clustering (FC), and the Multi-Criteria Decision Making (MCDM) methods Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and Geometrical Analysis for Interactive Aid (GAIA). FTIR-ATR spectroscopy had two important advantages over previous methods: (i) sample throughput and spectral collection were significantly improved (no physical flattening or microscope manipulations), and (ii) given recent advances in FTIR-ATR instrument portability, there is real potential to transfer this work's findings seamlessly to in-field applications.
The "raw" spectra, spectral subtractions and second-derivative spectra were compared to demonstrate the subtle differences between human hairs. SEM images were used as corroborative evidence of the surface topography of the hair, indicating that the condition of the cuticle surface falls into three types: untreated, mildly treated and chemically treated hair. Extensive study of the spectral regions potentially responsible for matching and discriminating various types of hair samples suggested that the 1690-1500 cm-1 region was to be preferred over the commonly used 1750-800 cm-1 region. The principal reason was the highly variable spectral profile of the cystine oxidation products (1200-1000 cm-1), which contributed significantly to spectral scatter and hence to poor hair sample matching. In the preferred 1690-1500 cm-1 region, conformational changes in the keratin protein, attributed to α-helix to β-sheet transitions in the Amide I and Amide II vibrations, played a significant role in matching and discriminating the spectra and hence the hair fibre samples. For gender comparison, the Amide II band proved significant for differentiation: male hair spectra exhibited a more intense β-sheet vibration in the Amide II band at approximately 1511 cm-1, whilst female hair spectra displayed a more intense α-helical vibration at 1520-1515 cm-1. In terms of chemical composition, female hair spectra exhibited greater intensities for the amino acids tryptophan (1554 cm-1) and aspartic and glutamic acid (1577 cm-1). For separation based on racial differences, untreated Caucasian hair was discriminated from Asian hair as a result of higher levels of the amino acids cystine and cysteic acid. However, when mildly or chemically treated, Asian and Caucasian hair fibres are similar, whereas African-type hair fibres remain distinct.
In terms of its novel contribution to the field of forensic science, this investigation has allowed for the development of a multifaceted, methodical protocol where previously none existed. The protocol is a systematic method for rapidly investigating unknown or questioned single human hair FTIR-ATR spectra of different genders and racial origins, including fibres with different cosmetic treatments. Unknown or questioned spectra are first separated on the basis of chemical treatment (untreated, mildly treated or chemically treated), then by gender, and then by racial origin (Asian, Caucasian or African-type). The methodology has the potential to complement current forensic methods of fibre-evidence analysis (i.e. microscopy and DNA), providing information at the morphological, genetic and structural levels.
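PCA of the kind used in the chemometric analysis above can be sketched with synthetic spectra: two groups that differ in relative Amide II band intensity separate cleanly along the first principal component. The band positions below follow the abstract, but the spectra, noise level and group contrast are synthetic, chosen only for illustration; this is not the dissertation's actual data pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.linspace(1500, 1690, 96)   # cm-1, the preferred window

def band(center, width=12.0):
    """Gaussian absorption band on the wavenumber grid."""
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

# Two synthetic classes differing in Amide II intensity, loosely mimicking
# the alpha-helix (1520-1515 cm-1) vs beta-sheet (1511 cm-1) contrast.
group_a = np.array([band(1652) + 1.0 * band(1517)
                    + 0.02 * rng.standard_normal(96) for _ in range(10)])
group_b = np.array([band(1652) + 0.4 * band(1511)
                    + 0.02 * rng.standard_normal(96) for _ in range(10)])
X = np.vstack([group_a, group_b])

# PCA via SVD of the mean-centred spectra matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s          # PCA scores, rows = samples, columns = components
```

Plotting the first two columns of `scores` is the usual way such group separations are visualised before clustering or MCDM ranking.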
Abstract:
While the importance of literature studies in the IS discipline is well recognized, little attention has been paid to the underlying structure and method of conducting effective literature reviews. Although literature is often used to refine the research context and direct the pathways to successful research outcomes, there is very little evidence of the use of resource management tools to support the literature review process. In this paper we aim to advance the way in which literature studies in Information Systems are conducted, by proposing a systematic, pre-defined and tool-supported method to extract, analyse and report literature. The paper presents how best to identify relevant IS papers to review within a feasible and justifiable scope, how to extract relevant content from identified papers, how to synthesise and analyse the findings of a literature review, and ways to effectively write up and present the results. It is specifically targeted at novice IS researchers seeking to conduct a systematic, detailed literature review in a focused domain. Specific contributions of our method are extensive tool support, the identification of appropriate papers (including primary and secondary paper sets) and a pre-codification scheme. We use a literature study on shared services as an illustrative example of the proposed approach.
Researching employment relations: a self-reflexive analysis of a multi-method, school-based project
Abstract:
Drawing on primary data and adjunct material, this article adopts a critical self-reflexive approach to a three-year, Australian Research Council-funded project that explored themes around 'employment citizenship' for high school students in Queensland. The article addresses three overlapping areas that reflect some of the central dilemmas and challenges arising through the project: consent in the context of research ethics, questionnaire administration in schools, and focus group research practice. It contributes to the broader methodological literature on research with young people by canvassing pragmatic suggestions for future school-based research and research addressing adolescent employment.
Abstract:
Background: The vast sequence divergence among different virus groups has presented a great challenge to alignment-based analysis of virus phylogeny. Because of the problems caused by uncertainty in alignment, existing tools for phylogenetic analysis based on multiple alignment cannot be directly applied to whole-genome comparison and phylogenomic studies of viruses. There has thus been growing interest in alignment-free methods for phylogenetic analysis using complete genome data. Among these, a dynamical language (DL) method proposed by our group has been applied successfully to the phylogenetic analysis of bacteria and chloroplast genomes. Results: In this paper, the DL method is used to analyze the whole-proteome phylogeny of 124 large dsDNA viruses and 30 parvoviruses, two data sets with a large difference in genome size. The trees from our analyses are in good agreement with the latest classification of large dsDNA viruses and parvoviruses by the International Committee on Taxonomy of Viruses (ICTV). Conclusions: The present method provides a new way to recover the phylogeny of large dsDNA viruses and parvoviruses, as well as some insights into the affiliation of a number of unclassified viruses. In comparison, some alignment-free methods such as the CV Tree method can recover the phylogeny of large dsDNA viruses but are not suitable for resolving the phylogeny of parvoviruses, which have much smaller genomes.
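Alignment-free comparison of the kind described above can be illustrated with a generic k-mer composition distance: each sequence is reduced to a normalised k-mer frequency profile, and profiles are compared without any alignment step. This is not the dynamical language (DL) method itself, only a minimal sketch of the alignment-free idea; the toy protein-like sequences are invented.

```python
from collections import Counter
from math import sqrt

def kmer_profile(seq, k=3):
    """Normalised k-mer frequency profile of a sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def cosine_distance(p, q):
    """1 - cosine similarity between two sparse k-mer profiles."""
    dot = sum(v * q.get(w, 0.0) for w, v in p.items())
    norm_p = sqrt(sum(v * v for v in p.values()))
    norm_q = sqrt(sum(v * v for v in q.values()))
    return 1.0 - dot / (norm_p * norm_q)

# Toy protein-like sequences: a differs from b by a single substitution
# per repeat, while c has an entirely different composition.
a = "MKTLLVLAVVAAALA" * 3
b = "MKTLLVLAVVAAGLA" * 3
c = "GGSGGSGGSGGSGGS" * 3
d_ab = cosine_distance(kmer_profile(a), kmer_profile(b))
d_ac = cosine_distance(kmer_profile(a), kmer_profile(c))
```

Pairwise distances computed this way over whole proteomes feed directly into standard distance-based tree-building methods such as neighbor-joining.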