975 results for Location model


Relevance: 20.00%

Abstract:

A month-long intensive measurement campaign was conducted in March/April 2007 at Agnes Water, a remote coastal site just south of the Great Barrier Reef on the east coast of Australia. Particle and ion size distributions were continuously measured during the campaign. Coastal nucleation events were observed in clean, marine air masses coming from the south-east on 65% of the days. The events usually began at ~10:00 local time and lasted for 1-4 hrs. They were characterised by the appearance of a nucleation mode with a peak diameter of ~10 nm. The freshly nucleated particles grew within 1-4 hrs up to sizes of 20-50 nm. The events occurred when solar intensity was high (~1000 W m-2) and RH was low (~60%). Interestingly, the events were not related to tide height. The volatile and hygroscopic properties of freshly nucleated particles (17-22.5 nm), simultaneously measured with a volatility-hygroscopicity-tandem differential mobility analyser (VH-TDMA), were used to infer chemical composition. The majority of the volume of these particles was attributed to internally mixed sulphate and organic components. After ruling out coagulation as a source of significant particle growth, we conclude that the condensation of sulphate and/or organic vapours was most likely responsible for driving particle growth during the nucleation events. We cannot make any direct conclusions regarding the chemical species that participated in the initial particle nucleation. However, we suggest that nucleation may have resulted from the photo-oxidation products of unknown sulphur or organic vapours emitted from the waters of Hervey Bay, or from the formation of DMS-derived sulphate clusters over the open ocean that were activated to observable particles by condensable vapours emitted from the nutrient rich waters around Fraser Island or Hervey Bay. Furthermore, a unique and particularly strong nucleation event was observed during northerly wind. 
The event began early in the morning (08:00) and lasted almost the entire day, resulting in the production of a large number of ~80 nm particles (the average modal concentration during the event was 3200 cm-3). The Great Barrier Reef was the most likely source of the precursor vapours responsible for this event.

Relevance: 20.00%

Abstract:

The main aim of radiotherapy is to deliver a dose of radiation that is high enough to destroy the tumour cells while at the same time minimising the damage to normal healthy tissues. Clinically, this has been achieved by assigning a prescription dose to the tumour volume and a set of dose constraints on critical structures. Once an optimal treatment plan has been achieved, the dosimetry is assessed using the physical parameters of dose and volume. There has been interest in using radiobiological parameters to evaluate and predict the outcome of a treatment plan in terms of both a tumour control probability (TCP) and a normal tissue complication probability (NTCP). In this study, simple radiobiological models available in a commercial treatment planning system were used to compare three-dimensional conformal radiotherapy (3D-CRT) and intensity-modulated radiotherapy (IMRT) treatments of the prostate. Initially, both 3D-CRT and IMRT were planned for 2 Gy/fraction to a total dose of 60 Gy to the prostate. The sensitivity of the TCP and the NTCP to both conventional dose escalation and hypo-fractionation was investigated. The biological responses were calculated using the Källman S-model. The complication-free tumour control probability (P+) is generated from the combined NTCP and TCP response values. It has been suggested that the alpha/beta ratio for prostate carcinoma cells may be lower than that for most other tumour cell types. The effect of this on the modelled biological response for the different fractionation schedules was also investigated.
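The quantities involved can be illustrated with a short sketch. Note this uses the simple linear-quadratic Poisson TCP model rather than the Källman S-model of the planning system, and every parameter value (alpha, beta, clonogen number, the assumed NTCP) is illustrative only:

```python
import math

def lq_survival(d, n, alpha, beta):
    """Linear-quadratic surviving fraction after n fractions of d Gy each."""
    return math.exp(-n * (alpha * d + beta * d ** 2))

def poisson_tcp(n0, sf):
    """Poisson tumour control probability: chance that no clonogen survives."""
    return math.exp(-n0 * sf)

# Illustrative assumptions, not clinical values: a "low" alpha/beta of 3 Gy,
# as has been suggested for prostate carcinoma, and 10^6 clonogens.
alpha, beta, n0 = 0.15, 0.05, 1e6

tcp_conv = poisson_tcp(n0, lq_survival(2.0, 30, alpha, beta))  # 30 x 2 Gy = 60 Gy
tcp_hypo = poisson_tcp(n0, lq_survival(3.0, 20, alpha, beta))  # 20 x 3 Gy = 60 Gy

# One common approximation of complication-free tumour control, P+,
# with an assumed (hypothetical) rectal NTCP of 5%:
ntcp = 0.05
p_plus = tcp_hypo * (1.0 - ntcp)

# At the same total dose, the low alpha/beta ratio favours the schedule
# with the larger dose per fraction, so tcp_hypo exceeds tcp_conv here.
```

For a tumour with a low alpha/beta ratio the quadratic term dominates, which is why hypofractionation looks attractive for the prostate in this kind of modelling.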

Relevance: 20.00%

Abstract:

John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria.
He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless, his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.

Relevance: 20.00%

Abstract:

New mobile digital communication technologies present opportunities for advertisers to capitalize on the evolving relationships of consumers with their mobile devices and their desire to access enhanced information services while mobile (m-services). Consumers already use mobile devices (cell phones, personal mobile digital assistants) for traditional phone calls and message handling (e.g., Kalakota and Robinson, 2002; Sullivan Mort and Drennan, 2002). The combination of rapidly developing mobile digital technology and high uptake rates of mobile devices presents enormous potential for delivery of m-services through these devices (Bitner, Brown, and Meuter, 2000). M-services encompass a wide variety of types, including the ability to trade stock, to book theater and movie tickets while accessing seating plans online, to send and receive text and pictures, and to receive personalized direct advertising such as alerts for shopping bargains. Marketing communications, and specifically advertising, may be delivered as an m-service, termed m-services advertising, forming part of the broader category of m-services. However, advertising research has not yet addressed the area of m-services and needs to do so to be able to take advantage of the advanced interactivity (Yadav and Varadarajan, 2005) of mobile communication devices. Such advertising research is likely to help develop open attitudes and responses to new business models, as has been advocated for other new technology such as advanced television (Tauder, 2005). In this article, we model the factors influencing the use of m-services, in the context of consumers’ existing relationships with mobile devices. First, we address the value propositions underpinning consumer involvement with mobile devices. Next, we canvass the types of involvement relevant to this consumption domain and argue that involvement, together with the personal attributes of innovativeness and self-efficacy, will influence the use of m-services.
Finally, implications for advertising delivered as an m-service are discussed, the potential for m-services advertising as part of m-commerce is canvassed, and directions for future research are identified.

Relevance: 20.00%

Abstract:

PURPOSE: To introduce techniques for deriving a map that relates visual field locations to optic nerve head (ONH) sectors and to use the techniques to derive a map relating Medmont perimetric data to data from the Heidelberg Retinal Tomograph. METHODS: Spearman correlation coefficients were calculated relating each visual field location (Medmont M700) to rim area and volume measures for 10-degree ONH sectors (HRT III software) for 57 participants: 34 with glaucoma, 18 with suspected glaucoma, and 5 with ocular hypertension. Correlations were constrained to be anatomically plausible with a computational model of the axon growth of retinal ganglion cells (Algorithm GROW). GROW generated a map relating field locations to sectors of the ONH. For each location, the sector with the maximum statistically significant (P < 0.05) correlation coefficient within 40 degrees of the angle predicted by GROW was then selected. Before correlation, both functional and structural data were normalized by either normative data or the fellow eye in each participant. RESULTS: The model of axon growth produced a 24-2 map that is qualitatively similar to existing maps derived from empirical data. When GROW was used in conjunction with normative data, 31% of field locations exhibited a statistically significant relationship. This significance increased to 67% (z-test, z = 4.84; P < 0.001) when both field and rim area data were normalized with the fellow eye. CONCLUSIONS: A computational model of axon growth and normalizing data by the fellow eye can assist in constructing an anatomically plausible map connecting visual field data and sectoral ONH data.
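The sector-selection step can be sketched as follows. The data here are synthetic stand-ins, the predicted angle merely plays the role of a GROW output, and significance is approximated by a critical |rho| of 0.26 (roughly P < 0.05 for n = 57) - all assumptions for illustration:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation (no ties expected in continuous data)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

rng = np.random.default_rng(0)
n_subjects, n_sectors = 57, 36                 # 36 ten-degree ONH sectors

# Synthetic stand-ins: rim area per sector, plus one field location that
# truly co-varies with sector 12 (centre angle 125 degrees).
rim_area = rng.normal(size=(n_subjects, n_sectors))
field_loc = rim_area[:, 12] + rng.normal(scale=0.5, size=n_subjects)

predicted_angle = 125.0        # angle a GROW-like model might predict (assumed)
rho_crit = 0.26                # approx. critical rho for n = 57 at P < 0.05

best_sector, best_rho = None, 0.0
for s in range(n_sectors):
    centre = s * 10 + 5
    angular_diff = abs((centre - predicted_angle + 180) % 360 - 180)
    if angular_diff > 40:      # anatomical plausibility constraint
        continue
    rho = spearman_rho(field_loc, rim_area[:, s])
    if rho > rho_crit and rho > best_rho:      # only positive, significant rho
        best_sector, best_rho = s, rho
```

Only sectors within 40 degrees of the predicted angle are ever considered, which is what keeps the resulting structure-function map anatomically plausible.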

Relevance: 20.00%

Abstract:

This paper explores how we may transform people’s perceived access to cultural participation by exploiting the possible relationships between place, play and mobile devices. It presents SCOOT, a location-based game, in order to investigate how aspects of game-play can be employed to evoke at once playful and culturally meaningful experiences of place. In particular, this paper is concerned with how the portable, communicative and social affordances of mobile phones are integral to making a “now everything looks like a game” experience.

Relevance: 20.00%

Abstract:

In this paper we explore what is required of a User Interface (UI) design in order to encourage participation around playing and creating Location-Based Games (LBGs). To base our research in practice, we present Cipher Cities, a web-based system. Through the design of this system, we investigate how UI design can provide tools for complex content creation to complement and encourage the use of mobile phones for designing, distributing, and playing LBGs. Furthermore, we discuss how UI design can promote and support socialisation around LBGs through the design of functional interface components and services such as groups, user profiles, and player status listings.

Relevance: 20.00%

Abstract:

This program of research examines the experience of chronic pain in a community sample. While it is clear that, as in patient samples, chronic pain in non-patient samples is associated with psychological distress and physical disability, the experience of pain across the total spectrum of pain conditions (including acute and episodic pain conditions) and during the early course of chronic pain is less clear. Information about these aspects of the pain experience is important because effective early intervention for chronic pain relies on identification of people who are likely to progress to chronicity post-injury. A conceptual model of the transition from acute to chronic pain was proposed by Gatchel (1991a). In brief, Gatchel’s model describes three stages that individuals who have a serious pain experience move through, each with worsening psychological dysfunction and physical disability. The aims of this program of research were to describe the experience of pain in a community sample in order to obtain pain-specific data on the problem of pain in Queensland, and to explore the usefulness of Gatchel’s Model in a non-clinical sample. Additionally, five risk factors and six protective factors were proposed as possible extensions to Gatchel’s Model. To address these aims, a prospective longitudinal mixed-method research design was used. Quantitative data were collected in Phase 1 via a comprehensive postal questionnaire. Phase 2 consisted of a follow-up questionnaire 3 months post-baseline. Phase 3 consisted of semi-structured interviews with a subset of the original sample 12 months post follow-up, using qualitative data to provide a further in-depth examination of the experience and process of chronic pain from the respondents’ point of view. The results indicate that chronic pain is associated with high levels of anxiety and depressive symptoms.
However, the levels of disability reported by this Queensland sample were generally lower than those reported by clinical samples and consistent with disability data reported in a New South Wales population-based study. With regard to the second aim of this program of research, while some elements of the pain experience of this sample were consistent with that described by Gatchel’s Model, overall the model was not a good fit with the experience of this non-clinical sample. The findings indicate that passive coping strategies (minimising activity), catastrophising, self-efficacy, optimism, social support, active strategies (use of distraction) and the belief that emotions affect pain may be important to consider in understanding the processes that underlie the transition to and continuation of chronic pain.

Relevance: 20.00%

Abstract:

Providing support for reversible transformations as a basis for round-trip engineering is a significant challenge in model transformation research. While there are a number of current approaches, they require the underlying transformation to exhibit an injective behaviour when reversing changes. This, however, does not serve all practical transformations well. In this paper, we present a novel approach to round-trip engineering that does not place restrictions on the nature of the underlying transformation. Based on abductive logic programming, it allows us to compute a set of legitimate source changes that equate to a given change to the target model. Encouraging results are derived from an initial prototype that supports most concepts of the Tefkat transformation language.
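The central computation, finding every legitimate source change that equates to a given target change, can be sketched with a brute-force stand-in for the abductive machinery. The transformation and models below are invented toys and bear no relation to Tefkat's actual language:

```python
from itertools import combinations

def transform(source):
    """A deliberately non-injective transformation: it only counts instances
    per class, so many different source models map to the same target."""
    target = {}
    for _, cls in source:
        target[cls] = target.get(cls, 0) + 1
    return target

def abduce_source_changes(source, new_target, pool):
    """Brute-force 'abduction': return every set of additions from `pool`
    whose transformed result equals the edited target model."""
    explanations = []
    for r in range(len(pool) + 1):
        for added in combinations(pool, r):
            if transform(source + list(added)) == new_target:
                explanations.append(list(added))
    return explanations

source = [("p1", "Person"), ("p2", "Person")]
new_target = {"Person": 2, "Address": 1}       # someone edited the target model
pool = [("a1", "Address"), ("a2", "Address")]  # candidate source-level edits

explanations = abduce_source_changes(source, new_target, pool)
# Two legitimate source changes explain the one target change (add a1, or
# add a2): reversal is not unique, hence the need to compute a *set*.
```

A real abductive logic programming engine searches for such explanations symbolically rather than by enumeration, but the input/output contract is the same.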

Relevance: 20.00%

Abstract:

With the advent of Service Oriented Architecture, Web Services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirement of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods to improve the accuracy of Web service discovery to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user’s interest. Considering the semantic relationships of the words used in describing the services, as well as the use of input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individual matched services should fully satisfy the requirements which the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content present in the Web service description language document, the support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of the query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase.
Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost for traversal. The third phase, system integration, integrates the results from the preceding two phases by using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of the standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
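The all-pairs shortest-path step of the link-analysis phase can be sketched with the standard Floyd-Warshall algorithm; the service graph and linking costs below are hypothetical:

```python
INF = float("inf")

def all_pairs_shortest_paths(n, edges):
    """Floyd-Warshall: nodes are Web services, and an edge (u, v, w) means
    service v can consume service u's output at linking cost w."""
    dist = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        dist[u][v] = min(dist[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Hypothetical 4-service graph: composing services 0 -> 1 -> 3 (cost 2.0)
# beats both the direct edge 0 -> 3 (4.0) and the route via 2 (5.0).
edges = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 2.5), (2, 3, 2.5), (0, 3, 4.0)]
dist = all_pairs_shortest_paths(4, edges)
```

Running Floyd-Warshall once yields the minimum linking cost between every pair of candidate services, which is exactly what an "optimum path at the minimum cost for traversal" requires when many compositions must be compared.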

Relevance: 20.00%

Abstract:

In architecture courses, instilling a wider understanding of the industry-specific representations practiced in the building industry is normally done under the auspices of Technology and Science subjects. Traditionally, building industry professionals communicated their design intentions using industry-specific representations. Originally, these mainly two-dimensional representations, such as plans, sections, elevations, schedules, etc., were produced manually, using a drawing board. Currently, this manual process has been digitised in the form of Computer Aided Design and Drafting (CADD) or, ubiquitously, simply CAD. While CAD has significant productivity and accuracy advantages over the earlier manual method, it still only produces industry-specific representations of the design intent. Essentially, CAD is a digital version of the drawing board. The tool used for the production of these representations in industry is still mainly CAD. This is also the approach taken in most traditional university courses and mirrors the reality of the situation in the building industry. A successor to CAD, in the form of Building Information Modelling (BIM), is presently evolving in the construction industry. CAD is mostly a technical tool that conforms to existing industry practices. BIM, on the other hand, is revolutionary both as a technical tool and as an industry practice. Rather than producing representations of design intent, BIM produces an exact virtual prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team. Essentially, BIM builds any building twice: once in the virtual world, where any faults are resolved, and finally in the real world. There is, however, no established model for learning through the use of this technology in architecture courses.
Queensland University of Technology (QUT), a tertiary institution that maintains close links with industry, recognises the importance of equipping its graduates with skills that are relevant to industry. BIM skills are in increasing demand throughout the construction industry as industry practices evolve. As such, during the second half of 2008, QUT fourth-year architectural students were formally introduced to BIM for the first time, as both a technology and an industry practice. This paper will outline the teaching team’s experiences and methodologies in offering a BIM unit (Architectural Technology and Science IV) at QUT for the first time and will provide a description of the learning model. The paper will present the results of a survey on the learners’ perspectives of both BIM and their learning experiences as they learn about and through this technology.

Relevance: 20.00%

Abstract:

Recent decisions of the Family Court of Australia reflect concerns over the adversarial nature of the legal process. The processes and procedures of the judicial system militate against a detailed examination of the issues and rights of the parties in dispute. The limitations of the family law framework are particularly demonstrated in disputes over the custody of children, where the Court has tended to neglect the rights and interests of the primary carer. An alternative "unified family court" framework will be examined, in which the Court pursues a more active and interventionist approach in the determination of family law disputes.

Relevance: 20.00%

Abstract:

It has previously been found that complexes comprised of vitronectin and growth factors (VN:GF) enhance keratinocyte protein synthesis and migration. More specifically, these complexes have been shown to significantly enhance the migration of dermal keratinocytes derived from human skin. In view of this, it was thought that these complexes might hold potential as a novel therapy for healing chronic wounds. However, there was no evidence indicating that the VN:GF complexes would retain their effect on keratinocytes in the presence of chronic wound fluid. The studies in this thesis demonstrate for the first time not only that the VN:GF complexes stimulate proliferation and migration of keratinocytes, but also that these effects are maintained in the presence of chronic wound fluid in a 2-dimensional (2-D) cell culture model. Whilst the 2-D culture system provided insights into how the cells might respond to the VN:GF complexes, this investigative approach is not ideal as skin is a 3-dimensional (3-D) tissue. In view of this, a 3-D human skin equivalent (HSE) model, which reflects more closely the in vivo environment, was used to test the VN:GF complexes on epidermopoiesis. These studies revealed that the VN:GF complexes enable keratinocytes to migrate, proliferate and differentiate on a de-epidermalised dermis (DED), ultimately forming a fully stratified epidermis. In addition, fibroblasts were seeded on DED and shown to migrate into the DED in the presence of the VN:GF complexes and hyaluronic acid, another important biological factor in the wound healing cascade. This HSE model was then further developed to enable studies examining the potential of the VN:GF complexes in epidermal wound healing. Specifically, a reproducible partial-thickness HSE wound model was created in fully-defined media and monitored as it healed. In this situation, the VN:GF complexes were shown to significantly enhance keratinocyte migration and proliferation, as well as differentiation.
This model was also subsequently utilized to assess the wound healing potential of a synthetic fibrin-like gel that had previously been demonstrated to bind growth factors. Of note, keratinocyte re-epithelialisation was shown to be markedly improved in the presence of this 3-D matrix, highlighting its future potential for use as a delivery vehicle for the VN:GF complexes. Furthermore, when this synthetic fibrin-like gel was injected into a 4 mm diameter full-thickness wound created in the HSE, both keratinocytes and fibroblasts were shown to migrate into the gel, as revealed by immunofluorescence. Interestingly, keratinocyte migration into this matrix was found to be dependent upon the presence of the fibroblasts. Taken together, these data indicate that reproducible wounds, as created in the HSEs, provide a relevant ex vivo tool to assess potential wound healing therapies. Moreover, the models will decrease our reliance on animals for scientific experimentation. Additionally, it is clear that these models will significantly assist in the development of novel treatments, such as the VN:GF complexes and the synthetic fibrin-like gel described herein, ultimately facilitating their clinical trial in the treatment of chronic wounds.

Relevance: 20.00%

Abstract:

Chronic wounds are a significant socioeconomic problem for governments worldwide. Approximately 15% of people who suffer from diabetes will experience a lower-limb ulcer at some stage of their lives, and 24% of these wounds will ultimately result in amputation of the lower limb. Hyperbaric Oxygen Therapy (HBOT) has been shown to aid the healing of chronic wounds; however, the causal reasons for the improved healing remain unclear and hence current HBOT protocols remain empirical. Here we develop a three-species mathematical model of wound healing that is used to simulate the application of hyperbaric oxygen therapy in the treatment of wounds. Based on our modelling, we predict that intermittent HBOT will assist chronic wound healing while normobaric oxygen is ineffective in treating such wounds. Furthermore, treatment should continue until healing is complete, and HBOT will not stimulate healing under all circumstances, leading us to conclude that finding the right protocol for an individual patient is crucial if HBOT is to be effective. We provide constraints that depend on the model parameters for the range of HBOT protocols that will stimulate healing. More specifically, we predict that patients with a poor arterial supply of oxygen, high consumption of oxygen by the wound tissue, chronically hypoxic wounds, and/or a dysfunctional endothelial cell response to oxygen are at risk of non-responsiveness to HBOT. The work of this paper can, in some way, highlight which patients are most likely to respond well to HBOT (for example, those with a good arterial supply), and thus has the potential to assist in improving both the success rate and the cost-effectiveness of this therapy.
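The qualitative prediction, that intermittent HBOT can push a chronically hypoxic wound towards healing while the untreated wound stalls, can be caricatured with a two-equation toy model. This is emphatically not the paper's three-species model; the equations, the bell-shaped endothelial response to oxygen, and every parameter value are illustrative assumptions only:

```python
import math

def simulate(days, hbot_sessions_per_day=0, dt=0.001):
    """Toy model: w is wound oxygen, v is blood-vessel density (0..1).
    Vessel growth peaks at moderate oxygen (bell-shaped response), so both
    chronic hypoxia and sustained hyperoxia suppress healing."""
    w, v = 0.1, 0.1                                  # chronically hypoxic start
    for i in range(int(days / dt)):
        t = (i * dt) % 1.0                           # time of day
        in_session = any(                            # ~90-minute HBOT sessions
            abs(t - (s + 0.5) / (hbot_sessions_per_day + 1)) < 0.03
            for s in range(hbot_sessions_per_day))
        h = 50.0 if in_session else 0.0              # therapy oxygen bolus
        g = math.exp(-((w - 1.0) ** 2) / 0.5)        # bell-shaped vessel response
        dw = 2.0 * v + h - 3.0 * w                   # perfusion + therapy - consumption
        dv = 1.5 * g * v * (1.0 - v) - 0.3 * v       # logistic growth - regression
        w += dt * dw
        v += dt * dv
    return v

v_untreated = simulate(30, hbot_sessions_per_day=0)  # wound stalls and regresses
v_hbot = simulate(30, hbot_sessions_per_day=1)       # daily sessions drive healing
```

In this caricature the untreated wound sits below the oxygen level needed for vessel growth, while a daily oxygen pulse sweeps the wound through the favourable oxygen window as it decays, echoing the paper's point that the protocol, not merely the presence of oxygen, determines whether healing is stimulated.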

Relevance: 20.00%

Abstract:

The weaknesses of 'traditional' modes of instruction in accounting education have been widely discussed. Many contend that the traditional approach limits the ability to provide opportunities for students to raise their competency level and to apply knowledge and skills in professional problem-solving situations. However, the recent body of literature suggests that accounting educators are indeed actively experimenting with 'non-traditional' and 'innovative' instructional approaches, with some authors clearly favouring one approach over another. But can one instructional approach alone meet the necessary conditions for different learning objectives? Taking into account the ever-changing landscape of not only business environments but also the higher education sector, the premise guiding the collaborators in this research is that it is perhaps counterproductive to promote competing dichotomous views of 'traditional' and 'non-traditional' instructional approaches to accounting education, and that the notion of 'blended learning' might provide a useful framework to enhance the learning and teaching of accounting. This paper reports on the first cycle of a longitudinal study, which explores the possibility of using blended learning in first-year accounting at one campus of a large regional university. The critical elements of blended learning which emerged in the study are discussed and, consistent with the design-based research framework, the paper also identifies key design modifications for successive cycles of the research.