181 results for: investigate significance of mining places


Relevance: 100.00%

Abstract:

The XML Document Mining track was launched to explore two main ideas: (1) identifying key problems and new challenges of the emerging field of mining semi-structured documents, and (2) studying and assessing the potential of Machine Learning (ML) techniques for dealing with generic ML tasks in the structured domain, i.e., classification and clustering of semi-structured documents. The track ran for six editions, during INEX 2005, 2006, 2007, 2008, 2009 and 2010. The first five editions have been summarized in previous reports, and we focus here on the 2010 edition. INEX 2010 included two tasks in the XML Mining track: (1) an unsupervised clustering task and (2) a semi-supervised classification task in which documents are organized in a graph. The clustering task requires the participants to group the documents into clusters, without any knowledge of category labels, using an unsupervised learning algorithm. The classification task, on the other hand, requires the participants to label the documents in the dataset with known categories using a supervised learning algorithm and a training set. This report gives the details of the clustering and classification tasks.
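The difference between the two tasks comes down to whether category labels are available. The unsupervised clustering side can be sketched in a few lines of pure Python; the 1-D "document features" and the tiny k-means below are a hypothetical illustration, not the track's actual methodology.

```python
# Minimal 1-D k-means, illustrating the unsupervised clustering task:
# documents are grouped by feature similarity without any category labels.
# The "document features" below are hypothetical 1-D values.

def kmeans_1d(points, k, iters=20):
    centroids = points[:k]                     # initialise with first k points
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

docs = [0.1, 0.2, 0.15, 5.0, 5.2, 4.9]   # two well-separated groups
centroids, clusters = kmeans_1d(docs, k=2)
```

The classification task would instead fit a model on a labelled training set and predict labels for the remaining documents; in the 2010 track the document graph additionally supplies link structure for the semi-supervised variant.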

Abstract:

Road crashes cost society, both worldwide and in Australia, a significant proportion of GDP, affecting productivity and causing significant suffering for communities and individuals. This paper presents a case study that generates data mining models contributing to the understanding of road crashes by allowing examination of the role of skid resistance (F60) and other road attributes in road crashes. Predictive data mining algorithms, primarily regression trees, were used to produce road segment crash count models from the road and traffic attributes of crash scenarios. The rules derived from the regression trees provide evidence of the significance of road attributes in contributing to crashes, with a focus on the evaluation of skid resistance.
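A regression tree of the kind described above repeatedly splits road segments on attribute thresholds so that each leaf predicts a mean crash count. A minimal single-split sketch (all F60 and crash-count values are hypothetical) shows the core computation:

```python
# A single regression-tree split: find the skid-resistance (F60) threshold
# that best separates road segments by mean crash count, i.e. the split
# minimising total squared error. All data values are hypothetical.

def sse(vals):
    # Squared error of predicting each value by the group mean.
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals)

def best_split(xs, ys):
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        err = sse(left) + sse(right)
        if best is None or err < best[0]:
            best = (err, t, sum(left) / len(left), sum(right) / len(right))
    return best  # (error, threshold, left-leaf mean, right-leaf mean)

f60 = [0.30, 0.35, 0.40, 0.55, 0.60, 0.65]   # skid resistance per segment
crashes = [9, 8, 7, 2, 1, 2]                  # crash counts per segment
err, threshold, low_f60_mean, high_f60_mean = best_split(f60, crashes)
```

A full regression tree applies this split recursively to each side and searches over all road and traffic attributes, not just F60.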

Abstract:

Recently, an analysis of the response curve of the vascular endothelial growth factor (VEGF) receptor and its application to cancer therapy was described in [T. Alarcón and K. Page, J. R. Soc. Lond. Interface 4, 283–304 (2007)]. The analysis is significantly extended here by demonstrating that an alternative computational strategy, namely the Krylov FSP algorithm for the direct solution of the chemical master equation, is feasible for the study of the receptor model. The new method allows us to further investigate the hypothesis of symmetry in the stochastic fluctuations of the response. Also, by augmenting the original model with a single reversible reaction, we formulate a plausible mechanism capable of realizing a bimodal response, which is reported experimentally but is not exhibited by the original model. The significance of these findings for mechanisms of tumour resistance to antiangiogenic therapy is discussed.
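For context, the chemical master equation that FSP-type methods solve directly has the standard form (generic notation, not the specific receptor model):

```latex
\frac{\partial P(\mathbf{x},t)}{\partial t}
  = \sum_{j=1}^{M} \left[ a_j(\mathbf{x}-\boldsymbol{\nu}_j)\,
      P(\mathbf{x}-\boldsymbol{\nu}_j,t)
    - a_j(\mathbf{x})\, P(\mathbf{x},t) \right]
```

where the a_j are the propensities of the M reactions and the ν_j their stoichiometric vectors. The finite state projection (FSP) truncates the (in general infinite) state space to a finite subset and certifies a bound on the resulting truncation error; the Krylov variant exponentiates the projected generator efficiently.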

Abstract:

In topological mapping, perceptual aliasing can cause different places to appear indistinguishable to the robot. When odometry information is severely corrupted or unavailable, topological mapping is difficult because the robot faces the loop-closing problem: determining whether it has visited a particular place before. In this article we propose to use neighbourhood information to disambiguate otherwise indistinguishable places. Using neighbourhood information for place disambiguation neither depends on a specific choice of sensors nor requires geometric information such as odometry. Local neighbourhood information is extracted from a sequence of observations of visited places. In experiments using either sonar or visual observations from an indoor environment, the benefits of using neighbourhood clues for the disambiguation of otherwise identical vertices are demonstrated. Over 90% of the maps we obtain are isomorphic with the ground truth, and the choice of the robot's sensors has little impact on the results.
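The neighbourhood idea can be sketched concretely: two vertices with identical local observations are distinguished by the multiset of observations at their neighbours. The graph, place names and observation strings below are hypothetical, not the article's dataset.

```python
# Two vertices whose own observations are identical ("corridor") can be
# told apart by the multiset of observations at their graph neighbours.
from collections import Counter

def signature(graph, observations, vertex):
    # Own observation plus the multiset of neighbour observations.
    neighbour_obs = Counter(observations[n] for n in graph[vertex])
    return (observations[vertex], frozenset(neighbour_obs.items()))

graph = {
    "A": ["B", "C"],    # place A adjoins a door and a lobby
    "D": ["E", "F"],    # place D adjoins two doors
    "B": ["A"], "C": ["A"], "E": ["D"], "F": ["D"],
}
observations = {"A": "corridor", "D": "corridor",
                "B": "door", "C": "lobby", "E": "door", "F": "door"}

# A and D look identical on their own, but their neighbourhoods differ.
ambiguous = observations["A"] == observations["D"]
distinct = signature(graph, observations, "A") != signature(graph, observations, "D")
```

Here the sensor-independence of the approach is visible: the observation values are opaque labels, so any sensor that produces repeatable place descriptions would do.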

Abstract:

Since users became the focus of product/service design in the last decade, the term User eXperience (UX) has been used frequently in the field of Human-Computer Interaction (HCI). Research on UX facilitates a better understanding of the various aspects of the user's interaction with the product or service. Mobile video, as a new and promising service and research field, has attracted great attention. Due to the significance of UX in the success of mobile video (Jordan, 2002), many researchers have concentrated on this area, examining users' expectations, motivations, requirements, and usage context. As a result, many influencing factors have been explored (Buchinger, Kriglstein, Brandt & Hlavacs, 2011; Buchinger, Kriglstein & Hlavacs, 2009). However, a general framework for structuring this large number of factors for the specific mobile video service is lacking. To measure the user experience of multimedia services such as mobile video, quality of experience (QoE) has recently become a prominent concept. In contrast to the traditional concept of quality of service (QoS), QoE not only involves objectively measuring the delivered service but also takes into account the user's needs and desires when using the service, emphasizing the user's overall acceptance of the service. Many QoE metrics are able to estimate the user-perceived quality or acceptability of mobile video, but may not be accurate enough for overall UX prediction due to the complexity of UX. Only a few QoE frameworks address further aspects of UX for mobile multimedia applications, and these need to be transformed into practical measures. The challenge of optimizing UX remains adapting to resource constraints (e.g., network conditions, mobile device capabilities, and heterogeneous usage contexts) as well as meeting complicated user requirements (e.g., usage purposes and personal preferences).
In this chapter, we examine the existing important UX frameworks, compare their similarities, and discuss some important features that fit the mobile video service. Based on the previous research, we propose a simple UX framework for mobile video applications by mapping a variety of influencing factors of UX onto a typical mobile video delivery system. Each component and its factors are explored with comprehensive literature reviews. The proposed framework may benefit user-centred design of mobile video, by taking complete consideration of UX influences, and improvement of mobile video service quality, by adjusting the values of certain factors to produce a positive user experience. It may also facilitate related research by locating important issues to study, clarifying research scopes, and setting up proper study procedures. We then review a large body of research on UX measurement, including QoE metrics and QoE frameworks for mobile multimedia. Finally, we discuss how to achieve an optimal quality of user experience by focusing on various aspects of the UX of mobile video, and in the conclusion we suggest some open issues for future study.

Abstract:

A wireless sensor network system must have the ability to tolerate harsh environmental conditions and reduce communication failures. In a typical outdoor situation, the presence of wind can introduce movement in the foliage. This motion of vegetation structures causes large and rapid signal fading in the communication link and must be accounted for when deploying a wireless sensor network system in such conditions. This thesis examines the fading characteristics experienced by wireless sensor nodes due to the effect of varying wind speed in a foliage-obstructed transmission path. It presents extensive measurement campaigns at two locations, using a typical wireless sensor network configuration. The significance of this research lies in the varied approaches of its experiments, involving a variety of vegetation types, scenarios and the use of different polarisations (vertical and horizontal). The non-line-of-sight (NLoS) scenarios investigate the wind effect for different vegetation densities, including Acacia trees, Dogbane trees and tall grass, whereas the line-of-sight (LoS) scenario investigates the effect of wind when swaying grass affects the ground-reflected component of the signal. The vegetation types and scenarios were chosen to simulate real-life working conditions of wireless sensor network systems in outdoor foliated environments. The results from the measurements are presented as statistical models involving first- and second-order statistics. We found that in most cases the fading amplitude could be approximated by both the Lognormal and the Nakagami distribution, whose m parameter was found to depend on received power fluctuations. The Lognormal distribution is known to result from slow fading characteristics due to shadowing. This study concludes that fading caused by wind-induced variations in received power in wireless sensor network systems is insignificant.
There is no notable difference in Nakagami m values between the calm, low and windy wind-speed categories. The second-order analysis also shows that the durations of the deep fades are very short: 0.1 second for 10 dB attenuation below the RMS level for vertical polarisation, and 0.01 second for 10 dB attenuation below the RMS level for horizontal polarisation. Another key finding is that the received signal strength for horizontal polarisation performs more than 3 dB better than vertical polarisation for LoS and near-LoS (thin vegetation) conditions, and up to 10 dB better for denser vegetation conditions.
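The Nakagami m parameter discussed above can be estimated from measured fading amplitudes with the standard moment-based (inverse normalised variance) estimator, m = (E[r²])² / Var(r²). The sketch below applies it to simulated Rayleigh-fading samples, a special case of Nakagami fading with m = 1; the samples are synthetic stand-ins, not the thesis measurements.

```python
import math
import random

def nakagami_m(amplitudes):
    # Moment-based (inverse normalised variance) estimator:
    # m = (E[r^2])^2 / Var(r^2), computed on instantaneous power r^2.
    p = [a * a for a in amplitudes]
    mean_p = sum(p) / len(p)
    var_p = sum((x - mean_p) ** 2 for x in p) / len(p)
    return mean_p ** 2 / var_p

# Rayleigh fading (Nakagami with m = 1): envelope of a complex Gaussian.
# These simulated samples stand in for measured signal envelopes.
random.seed(1)
samples = [math.hypot(random.gauss(0, 1), random.gauss(0, 1))
           for _ in range(20000)]
m_hat = nakagami_m(samples)   # should be close to 1 for Rayleigh fading
```

Values of m above 1 indicate milder-than-Rayleigh fading, and values below 1 more severe fading, which is why m tracks the received power fluctuations reported in the thesis.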

Abstract:

While changes in work and employment practices in the mining sector have been profound, the literature addressing mining work is somewhat partial as it focuses primarily on the workplace as the key (or only) site of analysis, leaving the relationship between mining work and families and communities under-theorized. This article adopts a spatially oriented, case-study approach to the sudden closure of the Ravensthorpe nickel mine in the south-west of Western Australia to explore the interplay between the new scales and mobilities of labour and capital and work–family–community connections in mining. In the context of the dramatically reconfigured industrial arena of mining work, the study contributes to a theoretical engagement between employment relations and the spatial dimensions of family and community in resource-affected communities.

Abstract:

The epithelium of the corneolimbus contains stem cells for regenerating the corneal epithelium. Diseases and injuries affecting the limbus can lead to a condition known as limbal stem cell deficiency (LSCD), which results in loss of the corneal epithelium and subsequent chronic inflammation and scarring of the ocular surface. Advances in the treatment of LSCD have been achieved through the use of cultured human limbal epithelial (HLE) grafts to restore the epithelial stem cells of the ocular surface. These epithelial grafts are usually produced by the ex vivo expansion of HLE cells on human donor amniotic membrane (AM), but this is not without limitations. Although AM is the most widely accepted substratum for HLE transplantation, donor variation, risk of disease transfer, and rising costs have led to the search for alternative biomaterials to improve the surgical outcome of LSCD. Recent studies have demonstrated that Bombyx mori silk fibroin (hereafter referred to as fibroin) membranes support the growth of primary HLE cells, and thus this thesis explores the possibility of using fibroin as a biomaterial for ocular surface reconstruction. Ideally, the grafted sheets of cultured epithelium would provide a replenishing source of epithelial progenitor cells for maintaining the corneal epithelium; however, HLE cells lose their progenitor cell characteristics once removed from their niche. More severe ocular surface injuries, which result in stromal scarring, damage the epithelial stem cell niche, which subsequently leads to poor corneal re-epithelialisation post-grafting. An ideal solution to repairing the corneal limbus would therefore be to grow and transplant HLE cells on a biomaterial that also provides a means for replacing the underlying stromal cells required to better simulate the normal stem cell niche.
The recent discovery of limbal mesenchymal stromal cells (L-MSC) provides a possibility for stromal repair and regeneration, and therefore this thesis presents the use of fibroin as a possible biomaterial to support a three-dimensional tissue-engineered corneolimbus with both an HLE and an underlying L-MSC layer. Investigation into optimal scaffold design is necessary, including adequate separation of epithelial and stromal layers, as well as direct cell-cell contact. Firstly, the attachment, morphology and phenotype of HLE cells grown on fibroin were directly compared to those observed on donor AM, the current clinical standard substrate for HLE transplantation. The production, transparency, and permeability of fibroin membranes were also evaluated in this part of the study. Results revealed that fibroin membranes could be routinely produced using a custom-made film casting table and were found to be transparent and permeable. Attachment of HLE cells to fibroin after 4 hours in serum-free medium was similar to that supported by tissue culture plastic but approximately 6-fold less than that observed on AM. While HLE cultured on AM displayed superior stratification, epithelia constructed from HLE on fibroin maintained evidence of corneal phenotype (cytokeratin pair 3/12 expression; CK3/12) and displayed a comparable number and distribution of ΔNp63+ progenitor cells to that seen in cultures grown on AM. These results confirm the suitability of membranes constructed from silk fibroin as a possible substrate for HLE cultivation. One of the most important aspects of corneolimbal tissue engineering is to consider the reconstruction of the limbal stem cell niche to help form the natural limbus in situ. MSC with similar properties to bone marrow-derived MSC (BM-MSC) have recently been grown from the limbus of the human cornea. This thesis evaluated methods for culturing L-MSC and limbal keratocytes using various serum-free media.
The phenotype of the resulting cultures was examined using photography, flow cytometry for CD34 (keratocyte marker), CD45 (bone marrow-derived cell marker), CD73, CD90, CD105 (collectively MSC markers), CD141 (epithelial/vascular endothelial marker), and CD271 (neuronal marker), immunocytochemistry (alpha-smooth muscle actin; α-SMA), differentiation assays (osteogenesis, adipogenesis and chondrogenesis), and co-culture experiments with HLE cells. While all techniques supported, to varying degrees, the establishment of keratocyte and L-MSC cultures, sustained growth and serial propagation were only achieved in serum-supplemented medium or the MesenCult-XF® culture system (Stem Cell Technologies). Cultures established in MesenCult-XF® grew faster than those grown in serum-supplemented medium and retained a more optimal MSC phenotype. L-MSC cultivated in MesenCult-XF® were also positive for CD141, rarely expressed α-SMA, and displayed multi-potency. L-MSC supported growth of HLE cells, with the largest epithelial islands being observed in the presence of L-MSC established in MesenCult-XF® medium. All HLE cultures supported by L-MSC widely expressed the progenitor cell marker ΔNp63, along with the corneal differentiation marker CK3/12. Our findings conclude that MesenCult-XF® is a superior culture system for L-MSC, but further studies are required to explore the significance of CD141 expression in these cells. Following on from the findings of the previous two parts, silk fibroin was tested as a novel dual-layer construct containing both an epithelium and underlying stroma for corneolimbal reconstruction. In this section, the growth and phenotype of HLE cells on non-porous versus porous fibroin membranes were compared. Furthermore, the growth of L-MSC in either serum-supplemented medium or the MesenCult-XF® culture system within fibroin fibrous mats was investigated.
Lastly, the co-culture of HLE and L-MSC in serum-supplemented medium on and within fibroin dual-layer constructs was also examined. HLE on porous membranes displayed a flattened and squamous monolayer; in contrast, HLE on non-porous fibroin appeared cuboidal and stratified, closer in appearance to a normal corneal epithelium. Both constructs maintained CK3/12 expression and the distribution of ΔNp63+ progenitor cells. Dual-layer fibroin scaffolds consisting of HLE cells and L-MSC maintained a similar phenotype to that on the single layers alone. Overall, the present study proposed to create a three-dimensional limbal tissue substitute of HLE cells and L-MSC together, ultimately for safe and beneficial transplantation back into the human eye. The results show that HLE and L-MSC can be cultivated separately and together whilst maintaining a clinically feasible phenotype containing a majority of progenitor cells. In addition, L-MSC were able to be cultivated routinely in the MesenCult-XF® culture system while maintaining a high purity for the MSC characteristic phenotype. However, as a serum-free culture medium was not found to sustain growth of both HLE and L-MSC, the combination scaffold was created in serum-supplemented medium, indicating that further refinement of this cultured limbal scaffold is required. This thesis has also demonstrated a potential novel marker for L-MSC, and has generated knowledge which may impact on the understanding of stromal-epithelial interactions. These results support the feasibility of a dual-layer tissue-engineered corneolimbus constructed from silk fibroin, and warrant further studies into the potential benefits it offers to corneolimbal tissue regeneration. Further refinement of this technology should explore the potential benefits of using epithelial-stromal co-cultures with MesenCult-XF®-derived L-MSC.
Subsequent investigations into the effects of long-term culture on the phenotype and behaviour of the cells in the dual-layer scaffolds are also required. While this project demonstrated the feasibility in vitro for the production of a dual-layer tissue engineered corneolimbus, further studies are required to test the efficacy of the limbal scaffold in vivo. Future in vivo studies are essential to fully understand the integration and degradation of silk fibroin biomaterials in the cornea over time. Subsequent experiments should also investigate the use of both AM and silk fibroin with epithelial and stromal cell co-cultures in an animal model of LSCD. The outcomes of this project have provided a foundation for research into corneolimbal reconstruction using biomaterials and offer a stepping stone for future studies into corneolimbal tissue engineering.

Abstract:

A breaker restrike is an abnormal arcing phenomenon leading to possible breaker failure. Eventually, this failure interrupts the transmission and distribution of the electricity supply system until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on the measurement and interpretation of restrikes produced during switching of capacitor banks and shunt reactor banks in power systems. In 2008 a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the limitations of the radiometric measurement method are a band-limited frequency response as well as limitations in amplitude determination. Current restrike detection methods and algorithms require the use of wide-bandwidth current transformers and high-voltage dividers. A restrike switch model using the Alternative Transient Program (ATP) and Wavelet Transforms to support diagnostics is proposed. Restrike phenomena thus become the basis of a new diagnostic process for online interrupter monitoring, using measurements, ATP and Wavelet Transforms. This research project investigates the restrike switch model parameter 'A' (the dielectric voltage gradient), related to normal and slowed cases of the contact opening velocity, and the escalation voltages, which can be used as a diagnostic tool for a vacuum circuit-breaker (CB) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load, at current quenching or chopping, a transient voltage is developed across the contact gap. The dielectric strength of the gap should rise to a point where it can withstand this transient voltage; if it does not, the gap will flash over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap. This is the point at which the gap voltage has reached a value that exceeds the dielectric strength of the gap.
This research shows that a change in the opening contact velocity of the vacuum CB produces a corresponding change in the slope of the gap escalation voltage envelope. To investigate the diagnostic process, an ATP restrike switch model was modified with contact opening velocity computation for restrike waveform signature analyses, along with experimental investigations. This also enhanced a mathematical CB model with the empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted if there are similar restrike waveform signatures for measured and simulated waveforms. The restrike switch model applications are used for: computer simulations as virtual experiments, including predicting breaker restrikes; estimating the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and developing a restrike detection algorithm using Wavelet Transforms. A simulated high-frequency nozzle current magnitude was applied to an equation (derived from the literature) which can calculate the life extension of the interrupter of an SF6 high-voltage CB. The restrike waveform signatures for medium- and high-voltage CBs identify possible failure mechanisms such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using MATLAB software for automatic detection. An experimental investigation of a 12 kV vacuum CB diagnostic was carried out for the parameter determination, and a passive antenna calibration was also successfully developed with applications for field implementation.
The degradation features were also evaluated with a predictive interpretation technique from the experiments, and the subsequent simulation indicates that the drop in voltage related to the slow opening velocity measurement gives a measure of contact degradation. A predictive interpretation technique is a computer modelling approach for assessing switching device performance which allows one to vary a single parameter at a time; this is often difficult to do experimentally because of the variable contact opening velocity. The significance of this thesis outcome is that it is a non-intrusive method, developed using measurements, ATP and Wavelet Transforms, to predict and interpret breaker restrike risk. The measurements on high-voltage circuit-breakers can identify degradation that could interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for the monitoring of restrike phenomena developed by this research will form part of a diagnostic process that will be valuable for detecting breaker stresses relating to the interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
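The role of the Wavelet Transform in restrike detection can be sketched in a simplified form: a restrike injects a short high-frequency burst into an otherwise smooth voltage waveform, and large first-level Haar detail coefficients localise it. The waveform, burst position and threshold below are synthetic illustrations, not the detection algorithm developed in the thesis.

```python
import math

def haar_details(signal):
    # First-level Haar wavelet detail coefficients: large magnitudes mark
    # abrupt high-frequency activity such as a restrike transient.
    return [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2)
            for i in range(len(signal) // 2)]

# Synthetic voltage waveform: one smooth power-frequency cycle with a short
# high-frequency burst (the "restrike") injected around sample 120.
n = 256
wave = [math.sin(2 * math.pi * i / n) for i in range(n)]
for i in range(120, 128):
    wave[i] += 0.5 * (-1) ** i    # alternating-sign burst

details = haar_details(wave)
threshold = 5 * (sum(abs(d) for d in details) / len(details))
flagged = [i for i, d in enumerate(details) if abs(d) > threshold]
```

Each flagged detail index i corresponds to samples 2i and 2i+1 of the waveform, so the burst around sample 120 appears as a cluster of detail coefficients near index 60; a practical detector would use deeper decompositions and a calibrated threshold.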

Abstract:

Purpose: To investigate the significance of sources around measurement sites, assist the development of control strategies for the important sources, and mitigate the adverse effects of particulate air pollution. Methods: In this study, sampling was conducted at two sites located in urban/industrial and residential areas situated at roadsides along the Brisbane Urban Corridor. Ultrafine and fine particle measurements obtained at the two sites in June-July 2002 were analysed by Positive Matrix Factorization (PMF). Results: Six sources were present, including local traffic, two further traffic sources, biomass burning, and two currently unidentified sources. Secondary particles had a significant impact at Site 1, while nitrates, peak traffic hours and main roads located close to the source also affected the results for both sites. Conclusions: This significant traffic corridor exemplifies the type of sources present in heavily trafficked locations, and future attempts to control pollution in this type of environment could focus on the sources that were identified.
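PMF resolves the measured concentration matrix X into non-negative source contributions G and source profiles F, with X ≈ GF. The sketch below uses plain (unweighted) non-negative matrix factorization with multiplicative updates rather than PMF's error-weighted least squares, and a hypothetical samples-by-species matrix:

```python
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(row) for row in zip(*A)]

def nmf(X, k, iters=300, seed=0):
    # Multiplicative updates (Lee-Seung): X ~ G @ F with all entries >= 0,
    # a simplification of PMF, which additionally weights by uncertainties.
    rng = random.Random(seed)
    m, n, eps = len(X), len(X[0]), 1e-9
    G = [[rng.random() + 0.1 for _ in range(k)] for _ in range(m)]
    F = [[rng.random() + 0.1 for _ in range(n)] for _ in range(k)]
    for _ in range(iters):
        GtX = matmul(transpose(G), X)
        GtGF = matmul(transpose(G), matmul(G, F))
        F = [[F[i][j] * GtX[i][j] / (GtGF[i][j] + eps) for j in range(n)]
             for i in range(k)]
        XFt = matmul(X, transpose(F))
        GFFt = matmul(matmul(G, F), transpose(F))
        G = [[G[i][j] * XFt[i][j] / (GFFt[i][j] + eps) for j in range(k)]
             for i in range(m)]
    return G, F

def sq_error(X, G, F):
    R = matmul(G, F)
    return sum((X[i][j] - R[i][j]) ** 2
               for i in range(len(X)) for j in range(len(X[0])))

# Hypothetical samples-by-species matrix built from two known sources.
true_G = [[1, 0], [0, 2], [1, 1], [2, 1]]        # sample contributions
true_F = [[0.5, 0.2, 0.0], [0.1, 0.3, 0.4]]      # source profiles
X = matmul(true_G, true_F)

G, F = nmf(X, k=2)
err = sq_error(X, G, F)   # small residual: non-negative factors fit X
```

In a PMF study, the rows of F are interpreted as chemical source profiles (e.g. a traffic signature) and the columns of G as how strongly each source contributed to each sample, which is how the six sources above were resolved.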

Abstract:

Sound tagging has been studied for years. Among all sound types, music, speech, and environmental sound are the three most active research areas. This survey aims to provide an overview of the state-of-the-art development in these areas. We begin by discussing the meaning of tagging in the different sound areas. Some examples of sound tagging applications are introduced in order to illustrate the significance of this research. Typical tagging techniques include manual, automatic, and semi-automatic approaches. After reviewing work in music, speech and environmental sound tagging, we compare them and state the research progress to date. Research gaps are identified for each research area, and the common features of, and distinctions between, the three areas are discussed as well. Published datasets, tools used by researchers, and evaluation measures frequently applied in the analysis are listed. Finally, we summarise the worldwide distribution of countries dedicated to sound tagging research over the years.

Abstract:

Plant and machinery valuation is important to every company's annual financial reporting. It is reported under the non-current assets section, and valuers are generally employed to provide up-to-date valuations of non-current assets such as property, plant and equipment, which can make up to 80% of the total assets of a company. The valuation of plant and machinery is also important for other purposes such as securing loan facilities, sales, takeovers, insurance and auctions. The application of the 2005 International Financial Reporting Standards (IFRS) has had a subsequent impact on the financial sector as a whole. Accountants have to choose between the Historical Cost approach and the Market Value approach in determining the value of the client's assets. In Malaysia, the implementation of IFRS has had a domino effect on the financial system, especially for plant and machinery valuation for financial reporting. Comparison data for plant and machinery valuation are limited, unlike for land and building valuation. The question of Malaysian valuers' ability to comply with the IFRS standard is raised constantly, not just by accountants but also by other related parties such as financial institutions, government agencies and clients. This is happening because of different interpretations of the premise of value for plant and machinery, as well as the methods used and differences in standards of reporting among the valuers conducting plant and machinery valuation. The root of the problem lies in the lack of practical guidelines governing plant and machinery valuation practices and the different schools of thought among valuers: some follow the United Kingdom's RICS guidelines, whilst others are more comfortable with the United States' USPAP rules, especially on the premise of value.
This research investigates the international best practices of plant and machinery valuation and establishes the common valuation concepts, awareness, and application of valuation methodology and valuation process for plant and machinery valuation in Malaysia. It uses a combination of qualitative and quantitative research approaches. In the qualitative approach, content analyses were conducted of the international practices and the current Malaysian implementation of plant and machinery valuation. A survey (quantitative approach) via questionnaire was administered among the registered and probationary valuers in Malaysia to investigate their understanding and opinions relating to plant and machinery valuation based on current practices. The significance of this research is the identification of international plant and machinery practices and the understanding of current practices of plant and machinery valuation in Malaysia. It is found that a key issue besetting plant and machinery valuation practice is the limited number of resources available, whether from scholars or practitioners. This is supported by the general finding from the research survey, which indicates an immediate need for practical notes or guidelines to be developed and implemented to support Malaysian valuers practising plant and machinery valuation. This move will lead to a better understanding of plant and machinery valuation, reduced discrepancies in the valuation of plant and machinery, and increased accuracy among practising valuers.

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant.
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
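The gamma evaluation mentioned above combines a dose-difference criterion with a distance-to-agreement criterion: a point passes when some nearby reference point agrees within both tolerances. For illustration only (this is not the project's actual implementation), a brute-force 1D global gamma can be sketched as follows; the function name and the common 3%/3 mm defaults are assumptions, not values taken from the abstract:

```python
import numpy as np

def gamma_index(ref, evalu, spacing, dose_tol=0.03, dist_tol=3.0):
    """Brute-force 1D global gamma evaluation.

    ref, evalu : dose profiles sampled on the same regular grid (Gy)
    spacing    : grid spacing (mm)
    dose_tol   : dose criterion as a fraction of the max reference dose
    dist_tol   : distance-to-agreement criterion (mm)
    A point passes the comparison when its gamma value is <= 1.
    """
    dd = dose_tol * ref.max()            # absolute (global) dose criterion
    x = np.arange(ref.size) * spacing    # spatial coordinates (mm)
    gamma = np.empty(evalu.size, dtype=float)
    for i, (xi, de) in enumerate(zip(x, evalu)):
        # Gamma at point i: minimum combined dose/distance metric
        # over all reference points.
        dist2 = ((x - xi) / dist_tol) ** 2
        dose2 = ((ref - de) / dd) ** 2
        gamma[i] = np.sqrt((dist2 + dose2).min())
    return gamma
```

Practical implementations work on 3D dose grids and add interpolation and search-radius optimisations, in line with the normalised dose difference approach cited above [2]; the resolution-independent, interpolating behaviour described in the abstract goes beyond this minimal sketch.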


Historic house museums form a significant component of the built heritage and social history of a country. They vary from the elaborate mansions of the wealthy to the modest dwellings of the working class. Regardless of the original owner's status in society, these house museums are vital to an understanding of the architecture, culture and society of a bygone era. Newstead House, the oldest surviving residence in Brisbane, is the first house to be designated a 'Historic House Museum' in Queensland. It is a representative example of a house that demonstrates the British colonial heritage of 19th-century Australia. Originally a modest cottage on 34 acres of land, Newstead House was built by a Scottish migrant. The ownership of the house and land changed many times between 1847 and 1939, during which period a series of prominent Brisbane residents either owned or rented the residence. They included an officer of the Royal Navy, politicians, magistrates, merchant ship owners, and a Consul General of the United States of America. As a result, the house went through a series of renovations and extensions to accommodate the needs of its owners and their position in society. This paper investigates the significance of historic house museums in educating the community on aspects of the social history, culture and architecture of 19th-century Australia. It focuses on the heritage-listed Newstead House as a case study to demonstrate the significance of the house as both an artefact and an educational tool.


Objectives To investigate the factors associated with sudden infant death syndrome (SIDS) from birth to age 2 years, whether recent advice has been followed, whether any new risk factors have emerged, and the specific circumstances in which SIDS occurs while cosleeping (infant sharing the same bed or sofa with an adult or child). Design Four year population based case-control study. Parents were interviewed shortly after the death or after the reference sleep (within 24 hours) of the two control groups. Setting South west region of England (population 4.9 million, 184 800 births). Participants 80 SIDS infants and two control groups weighted for age and time of reference sleep: 87 randomly selected controls and 82 controls at high risk of SIDS (young, socially deprived, multiparous mothers who smoked). Results The median age at death (66 days) was more than three weeks less than in a study in the same region a decade earlier. Of the SIDS infants, 54% died while cosleeping compared with 20% among both control groups. Much of this excess may be explained by a significant multivariable interaction between cosleeping and recent parental use of alcohol or drugs (31% v 3% random controls) and the increased proportion of SIDS infants who had coslept on a sofa (17% v 1%). One fifth of SIDS infants used a pillow for the last sleep (21% v 3%) and one quarter were swaddled (24% v 6%). More mothers of SIDS infants than random control infants smoked during pregnancy (60% v 14%), whereas one quarter of the SIDS infants were preterm (26% v 5%) or were in fair or poor health for the last sleep (28% v 6%). All of these differences were significant in the multivariable analysis regardless of which control group was used for comparison. The significance of covering the infant's head, postnatal exposure to tobacco smoke, dummy use, and sleeping in the side position has diminished, although a significant proportion of SIDS infants were still found prone (29% v 10%).
Conclusions Many of the SIDS infants had coslept in a hazardous environment. The major influences on risk, regardless of markers for socioeconomic deprivation, are amenable to change and specific advice needs to be given, particularly on use of alcohol or drugs before cosleeping and cosleeping on a sofa.
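For illustration only, the crude (unadjusted) odds ratios implied by the exposure percentages reported above can be computed directly from the case and random-control proportions. These are not the study's multivariable estimates, which adjust for confounding and interactions; the factor labels below simply restate figures from the abstract:

```python
def odds_ratio(p_case, p_control):
    """Crude odds ratio from exposure proportions in cases vs controls."""
    return (p_case / (1 - p_case)) / (p_control / (1 - p_control))

# Exposure proportions reported in the abstract
# (SIDS infants vs random controls):
factors = {
    "coslept for last sleep": (0.54, 0.20),
    "pillow used": (0.21, 0.03),
    "swaddled": (0.24, 0.06),
    "mother smoked in pregnancy": (0.60, 0.14),
    "preterm": (0.26, 0.05),
}
for name, (p_case, p_ctrl) in factors.items():
    print(f"{name}: crude OR ~ {odds_ratio(p_case, p_ctrl):.1f}")
```

For example, the 54% v 20% cosleeping figures imply a crude odds ratio of about 4.7, and 60% v 14% maternal smoking about 9.2; adjusted estimates in the paper itself may differ substantially.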