965 results for Gap model
Abstract:
Scientists need to transfer semantically similar queries across multiple heterogeneous linked datasets. These queries may require data from different locations and the results are not simple to combine due to differences between datasets. A query model was developed to make it simple to distribute queries across different datasets using RDF as the result format. The query model, based on the concept of publicly recognised namespaces for parts of each scientific dataset, was implemented with a configuration that includes a large number of current biological and chemical datasets. The configuration is flexible, providing the ability to transparently use both private and public datasets in any query. A prototype implementation of the model was used to resolve queries for the Bio2RDF website, including both Bio2RDF datasets and other datasets that do not follow the Bio2RDF URI conventions.
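The namespace-based routing idea can be sketched as follows. This is a hypothetical illustration, not the prototype's actual code: the endpoint URLs, the `ENDPOINTS` table, and the `route` helper are all invented for the example; only the idea of grouping identifiers by publicly recognised namespace prefix comes from the abstract.

```python
# Hypothetical sketch: a query mentioning items from several datasets is
# split by namespace prefix, and each part is routed to the endpoint
# configured for that namespace. Endpoints below are placeholders.
ENDPOINTS = {
    "geneid": "http://bio2rdf.org/sparql",       # placeholder endpoint
    "chebi":  "http://example.org/chem/sparql",  # placeholder endpoint
}

def route(identifiers):
    """Group identifiers like 'geneid:4129' by namespace -> endpoint."""
    plan = {}
    for ident in identifiers:
        ns = ident.split(":", 1)[0]          # namespace prefix before ':'
        plan.setdefault(ENDPOINTS[ns], []).append(ident)
    return plan

plan = route(["geneid:4129", "chebi:15377", "geneid:7157"])
```

Because the configuration is just a prefix-to-endpoint table, private endpoints can be swapped in transparently, as the abstract describes.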
Abstract:
This paper develops a theoretical acceptance model for measuring Web personalization success. Key factors impacting Web personalization acceptance are identified from a detailed literature review. The final model is then cast in a structural equation modeling (SEM) framework comprising nineteen manifest variables, grouped into three focal behaviors of Web users. These variables provide a framework for better understanding the numerous factors that contribute to the success of Web personalization technology, especially those concerning the quality of personalized features and how personalized information can be delivered to the user through a personalized Website. The interrelationships between the success constructs are also explained. Empirical validation of this theoretical model is expected in future research.
Abstract:
Creative Commons (CC) is often seen as a social movement, dismissed by critics as a tool for hobbyists or academics who do not sell their creations to make a living. However, this paper argues that the licensing of creative copyright works under a CC licence does not preclude commercial gain. If used wisely, CC licences can be a useful tool for creators in their quest for commercial success. In particular, this paper argues that the sharing of creative works online under a CC licence allows creators to circumvent traditional distribution channels dominated by content intermediaries, whilst maintaining a level of control over their copyright works (i.e. explicitly reserving some rights but not all rights). This will be illustrated by case studies on how CC is being used by content creators and intermediaries respectively, and how successful their respective methods are in harnessing this tool.
Abstract:
In the past two decades there has been increasing interest in branding tourism destinations in an effort to meaningfully differentiate them from the myriad of competing places that offer similar attractions and facilities. The academic literature relating to destination branding commenced only as recently as 1998, and there remains a dearth of empirical data testing the effectiveness of brand campaigns, particularly in terms of enhancing destination loyalty. This paper reports the results of an investigation into destination brand loyalty for Australia as a long-haul destination in a South American market. In spite of the high level of academic interest in the measurement of perceptions of destinations since the 1970s, few previous studies have examined perceptions held by South American consumers. Drawing on a model of consumer-based brand equity (CBBE), antecedents of destination brand loyalty were tested with data from a large Chilean sample of travelers, comprising a mix of previous visitors and non-visitors to Australia. Findings suggest that destination brand awareness, brand image, and brand value are positively related to brand loyalty for a long-haul destination; destination brand quality, however, was not significantly related. The results also indicate that Australia is a more compelling destination brand for previous visitors than for non-visitors.
Abstract:
While Business Process Management (BPM) is an established discipline, the increased adoption of BPM technology in recent years has introduced new challenges. One challenge concerns dealing with the ever-growing complexity of business process models. Mechanisms for dealing with this complexity can be classified into two categories: 1) those that are solely concerned with the visual representation of the model and 2) those that change its inner structure. While significant attention is paid to the latter category in the BPM literature, this paper focuses on the former category. It presents a collection of patterns that generalize and conceptualize various existing mechanisms to change the visual representation of a process model. Next, it provides a detailed analysis of the degree of support for these patterns in a number of state-of-the-art languages and tools. This paper concludes with the results of a usability evaluation of the patterns conducted with BPM practitioners.
Abstract:
A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modelling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for configurable process modelling is restricted, thus hindering their applicability. Specifically, these notations focus on capturing tasks and control-flow dependencies, neglecting equally important ingredients of business processes such as data and resources. This research fills this gap by proposing a configurable process modelling notation incorporating features for capturing resources, data and physical objects involved in the performance of tasks. The proposal has been implemented in a toolset that assists analysts during the configuration phase and guarantees the correctness of the resulting process models. The approach has been validated by means of a case study from the film industry.
Abstract:
This article argues for exploring lesbian, gay, bisexual, and transgender (LGBT) young people's experiences with police. While research examines how factors such as indigeneity influence young people's experiences with police, how sexuality and/or gender identity mediates these relationships remains largely unexplored. Key bodies of research suggest a need to explore this area further, including: literature documenting links between homophobic violence against LGBT young people and outcomes, such as homelessness, that fall within the ambit of policing work; research showing the reluctance of LGBT communities to report crime to police; international research documenting homophobic police attitudes and Australian research demonstrating arguably homophobic court outcomes; and research outlining increasing police support of LGBT communities. Drawing on these bodies of literature, this article argues that LGBT young people's experience of policing warrants further research.
Abstract:
Erythromycin is the standard antibiotic used for treatment of Ureaplasma species during pregnancy; however, maternally administered erythromycin may be ineffective at eliminating intra-amniotic ureaplasma infections. We asked if erythromycin would eradicate intra-amniotic ureaplasma infections in pregnant sheep. At 50 days of gestation (d, term=150d) pregnant ewes received intra-amniotic injections of erythromycin-sensitive U. parvum serovar 3 (n=16) or 10B medium (n=16). At 100d, amniocentesis was performed; five fetal losses (ureaplasma group: n=4; 10B group: n=1) had occurred by this time. Remaining ewes were allocated into treatment subgroups: medium only (M, n=7); medium and erythromycin (M/E, n=8); ureaplasma only (Up, n=6) or ureaplasma and erythromycin (Up/E, n=6). Erythromycin was administered intramuscularly (500 mg), eight-hourly for four days (100d-104d). Amniotic fluid samples were collected at 105d. At 125d preterm fetuses were surgically delivered and specimens were collected for culture and histology. Erythromycin was quantified in amniotic fluid by liquid chromatography-mass spectrometry. Ureaplasmas were isolated from the amniotic fluid, chorioamnion and fetal lung of animals from the Up and Up/E groups; however, the numbers of U. parvum recovered were not different between these groups. Inflammation in the chorioamnion, cord and fetal lung was increased in ureaplasma-exposed animals compared to controls, but was not different between the Up and Up/E groups. Erythromycin was detected in amniotic fluid samples, although concentrations were low (<10-76 ng/mL). This study demonstrates that maternally administered erythromycin does not eradicate chronic, intra-amniotic ureaplasma infections or improve fetal outcomes in an ovine model, potentially due to the poor placental passage of erythromycin.
Abstract:
The field of literacy studies has always been challenged by the changing technologies that humans have used to express, represent and communicate their feelings, ideas, understandings and knowledge. However, while the written word has remained central to literacy processes over a long period, it is generally accepted that there have been significant changes to what constitutes ‘literate’ practice. In particular, the status of the printed word has been challenged by the increasing dominance of the image, along with the multimodal meaning-making systems facilitated by digital media. For example, Gunther Kress and other members of the New London Group have argued that the second half of the twentieth century saw a significant cultural shift from the linguistic to the visual as the dominant semiotic mode. This, in turn, they suggest, was accompanied by a cultural shift ‘from page to screen’ as a dominant space of representation (e.g. Cope & Kalantzis, 2000; Kress, 2003; New London Group, 1996). In a similar vein, Bill Green has noted that we have witnessed a shift from the regime of the print apparatus to a regime of the digital electronic apparatus (Lankshear, Snyder and Green, 2000). For these reasons, the field of literacy education has been challenged to find new ways to conceptualise what is meant by ‘literacy’ in the twenty-first century and to rethink the conditions under which children might best be taught to be fully literate so that they can operate with agency in today’s world.
Abstract:
In Australia, as far back as 1993, researchers such as Baladin and Chapmen reported that "18% of the total Australian population and 51% of the population over 60 years of age were identified as having a disability" (2001; p38.2). Statistics such as these are not by any means astonishing, even to members of the general public, and it is widely understood that these figures will only increase significantly in the near future. What is particularly surprising, however, in the face of such statistics, is the lack of new and creative responses to this demographic shift, particularly by the architecture and construction industries. The common response from a range of sectors seems to be the repetition of a series of models which offer limited, and often undesirable, housing options. It is against this backdrop, characterized by a lack of original options from mainstream practitioners and relevant government bodies, that the need has arisen to develop alternative models at the grass-roots level. This paper reports primarily on the work of one group, comprising a not-for-profit organization, a pro-bono design practice group and a local university, working together to design a more holistic, emotionally sustainable independent living model of housing for families where a member of the family has a disability. This approach recognizes the limitations of universal design, in that it often does not " ... meet all the housing needs that arise for people with moderate to severe disabilities" (Scotts, Margie et al, 2007; p.17). It is hoped that, by examining the work of such a collective, which is driven not by profit or policy but by the aim of addressing individual and community need first and foremost, better insight can be gained into the real requirements of individuals and families, and new ways of fulfilling them can be opened up.
Abstract:
A Simulink Matlab control system of a heavy vehicle (HV) suspension has been developed. The aim of the exercise presented in this paper was to produce a working model of a HV suspension that could be used for future research. A working computer model is easier and cheaper to re-configure than a HV axle group installed on a truck; it presents less risk should something go wrong and allows more scope for variation and sensitivity analysis before embarking on further "real-world" testing. Empirical data recorded as the input and output signals of a HV suspension were used to develop the parameters for computer simulation of a linear time invariant system described by a second-order differential equation (i.e. a "2nd-order" system). Using the empirical data as an input to the computer model allowed validation of its output against the empirical data. The errors ranged from less than 1% to approximately 3% for any parameter, when comparing like-for-like inputs and outputs. The model is presented along with the results of the validation. This model will be used in future research in the QUT/Main Roads project Heavy vehicle suspensions – testing and analysis, particularly for a theoretical model of a multi-axle HV suspension with varying values of dynamic load sharing. Allowance will need to be made for the errors noted when using the computer models in this future work.
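As an illustration of the kind of model described, here is a minimal sketch, not the authors' Simulink implementation: it integrates the standard linear time invariant second-order form m·y″ + c·y′ + k·y = u(t) with semi-implicit Euler and checks the settled step response against the analytic static value u/k. The parameter values are illustrative placeholders, not calibrated suspension data.

```python
# Minimal sketch of a "2nd-order" LTI system: m*y'' + c*y' + k*y = u(t).
# Parameters are illustrative, not calibrated heavy-vehicle suspension data.

def simulate_second_order(u, m=1.0, c=0.8, k=4.0, dt=0.001):
    """Integrate m*y'' + c*y' + k*y = u(t) with semi-implicit Euler."""
    y, v = 0.0, 0.0
    out = []
    for u_t in u:
        a = (u_t - c * v - k * y) / m   # acceleration from the ODE
        v += a * dt                      # update velocity first (semi-implicit)
        y += v * dt
        out.append(y)
    return out

# Step input of 1.0 for 20 s: the response should settle near u/k = 0.25,
# mimicking the validation of model output against recorded signals.
response = simulate_second_order([1.0] * 20000)
error_pct = abs(response[-1] - 0.25) / 0.25 * 100
```

With real data, the recorded input signal would replace the step and `error_pct` would be computed against the recorded output, which is the comparison the abstract reports at roughly 1-3%.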
Abstract:
Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at those locations, and to have outputs that could be validated with data acquired at the same sites, so that the outputs would be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used. The models also needed to be adaptable to variable operating conditions, so that they might be applied, where possible, to other similar systems and facilities. It was not possible to produce a stand-alone model applicable to all facilities and locations in this single study; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models to a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps where traffic leaves signalised intersections and those at unsignalised intersections. Constant departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be less when unsignalised intersections are located upstream of on-ramps than when signalised intersections are, and less still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration is required of the traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing. A general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and to provide further insight into the nature of operations.
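The headway and capacity machinery referred to above can be sketched as follows. This is an illustrative sketch under stated assumptions, not the thesis's calibrated model: it uses Cowan's M3 headway distribution for the major stream and the standard absolute-priority gap-acceptance capacity formula (the thesis extends this to Troutbeck's limited priority case); the linear bunching model for the proportion of free vehicles and all numeric values are assumptions.

```python
import math

# Illustrative sketch: minor-stream (on-ramp) entry capacity when major-stream
# headways follow Cowan's M3 model. Values and the bunching model are assumed,
# not the thesis's calibrated parameters.
def m3_capacity(q_major, t_c, t_f, delta=1.0, alpha=None):
    """Minor-stream capacity (veh/s) under Cowan M3 major-stream headways.

    q_major : major-stream flow (veh/s)
    t_c     : critical gap (s)
    t_f     : follow-on time (s)
    delta   : minimum headway between bunched vehicles (s)
    alpha   : proportion of free (unbunched) vehicles; a simple linear
              bunching model is assumed if not given.
    """
    if alpha is None:
        alpha = max(0.0, 1.0 - delta * q_major)      # assumed bunching model
    lam = alpha * q_major / (1.0 - delta * q_major)  # decay rate of free headways
    return (q_major * alpha * math.exp(-lam * (t_c - delta))
            / (1.0 - math.exp(-lam * t_f)))

# Capacity falls as the major-stream (kerb lane) flow rises.
cap_low = m3_capacity(0.2, t_c=2.0, t_f=1.1)   # 720 veh/h major flow
cap_high = m3_capacity(0.3, t_c=2.0, t_f=1.1)  # 1080 veh/h major flow
```

Note that, per the abstract, the critical gap `t_c` lies between the follow-on time and the follow-on time plus the 1 s minimum headway; the values above are chosen only to satisfy that range.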
Abstract:
BACKGROUND: Grafting of autologous hyaline cartilage and bone for articular cartilage repair is a well-accepted technique. Although encouraging midterm clinical results have been reported, no information on the mechanical competence of the transplanted joint surface is available. HYPOTHESIS: The mechanical competence of osteochondral autografts is maintained after transplantation. STUDY DESIGN: Controlled laboratory study. METHODS: Osteochondral defects were filled with autografts (7.45 mm in diameter) in one femoral condyle in 12 mature sheep. The ipsilateral femoral condyle served as the donor site, and the resulting defect (8.3 mm in diameter) was left empty. The repair response was examined after 3 and 6 months with mechanical and histologic assessment and histomorphometric techniques. RESULTS: Good surface congruity and plug placement were achieved. The Young modulus of the grafted cartilage dropped significantly to 57.5% of that of healthy tissue after 3 months (P < .05) but then recovered to 82.2% after 6 months. The aggregate and dynamic moduli behaved similarly. The graft edges showed fibrillation and, in some cases (4 of 6), hypercellularity and chondrocyte clustering. Subchondral bone sclerosis was observed in 8 of 12 cases, and the amount of mineralized bone in the graft area increased from 40% to 61%. CONCLUSIONS: The mechanical quality of transplanted cartilage varies considerably over a short period of time, potentially reflecting both degenerative and regenerative processes, while histologic signs of both cartilage and bone degeneration occur. CLINICAL RELEVANCE: Both the mechanically degenerative and restorative processes illustrate the complex progression of regeneration after osteochondral transplantation. The histologic evidence raises doubts about the long-term durability of osteochondral repair.
Abstract:
Although player enjoyment is central to computer games, there is currently no accepted model of player enjoyment in games. There are many heuristics in the literature, based on elements such as the game interface, mechanics, gameplay, and narrative. However, there is a need to integrate these heuristics into a validated model that can be used to design, evaluate, and understand enjoyment in games. We have drawn together the various heuristics into a concise model of enjoyment in games that is structured by flow. Flow, a widely accepted model of enjoyment, includes eight elements that, we found, encompass the various heuristics from the literature. Our new model, GameFlow, consists of eight elements -- concentration, challenge, skills, control, clear goals, feedback, immersion, and social interaction. Each element includes a set of criteria for achieving enjoyment in games. An initial investigation and validation of the GameFlow model was carried out by conducting expert reviews of two real-time strategy games, one high-rated and one low-rated, using the GameFlow criteria. The result was a deeper understanding of enjoyment in real-time strategy games and the identification of the strengths and weaknesses of the GameFlow model as an evaluation tool. The GameFlow criteria were able to successfully distinguish between the high-rated and low-rated games and identify why one succeeded and the other failed. We concluded that the GameFlow model can be used in its current form to review games; further work will provide tools for designing and evaluating enjoyment in games.
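The structure of such a criteria-based evaluation can be sketched as follows. The eight element names come from the abstract; averaging criterion ratings into per-element and overall scores is a hypothetical illustration, not the paper's actual expert-review procedure.

```python
# Hypothetical sketch: each GameFlow element holds a set of criterion ratings
# (0-5 scale assumed); averaging within and across elements is invented here
# purely to show how such an evaluation could be structured.

GAMEFLOW_ELEMENTS = ["concentration", "challenge", "skills", "control",
                     "clear goals", "feedback", "immersion", "social interaction"]

def gameflow_score(ratings):
    """Average criterion ratings per element, then across all elements.

    ratings: {element: [criterion ratings]}
    """
    per_element = {e: sum(r) / len(r) for e, r in ratings.items()}
    overall = sum(per_element.values()) / len(per_element)
    return per_element, overall

# A placeholder review giving every element the ratings [3, 4].
per_element, overall = gameflow_score({e: [3, 4] for e in GAMEFLOW_ELEMENTS})
```

Comparing per-element scores between two reviewed games is one way such a tool could surface why one game succeeds where another fails.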
Abstract:
A distinctive feature of Chinese text is that a Chinese document is a sequence of Chinese characters with no spaces or boundaries between Chinese words. This feature makes Chinese information retrieval more difficult, since a retrieved document that contains the query term as a sequence of Chinese characters may not be truly relevant to the query: the query term (as a sequence of Chinese characters) may not be a valid Chinese word in that document. On the other hand, a document that is actually relevant may not be retrieved because it does not contain the query sequence but contains other relevant words. In this research, we propose a hybrid Chinese information retrieval model that incorporates word-based techniques into the traditional character-based techniques. The aim of this approach is to investigate the influence of Chinese segmentation on the performance of Chinese information retrieval. Two ranking methods are proposed to rank retrieved documents based on relevancy to the query, calculated by combining character-based ranking and word-based ranking. Our experimental results show that Chinese segmentation can improve the performance of Chinese information retrieval, but the improvement is not significant if only Chinese segmentation is incorporated with the traditional character-based approach.
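The combination of character-based and word-based ranking can be sketched with a toy example. The scoring functions and the mixing weight below are illustrative assumptions, not the paper's actual ranking methods; the example shows how a document containing all the query characters but none of the segmented query words scores lower under the hybrid measure.

```python
# Toy sketch (not the paper's formulas): a hybrid relevance score that
# linearly mixes a character-based score with a word-based score computed
# over segmented words.

def char_score(query, doc):
    """Fraction of the query's characters that appear in the document."""
    return sum(ch in doc for ch in query) / len(query)

def word_score(query_words, doc_words):
    """Fraction of the segmented query words that appear in the document."""
    return sum(w in doc_words for w in query_words) / len(query_words)

def hybrid_score(query, query_words, doc, doc_words, weight=0.5):
    """Mix word-based and character-based relevance; weight=1 is word-only."""
    return (weight * word_score(query_words, doc_words)
            + (1 - weight) * char_score(query, doc))

# Query "北京大学" segmented as ["北京", "大学"]. The first document contains
# both words; the second contains all four characters but neither word.
q, q_words = "北京大学", ["北京", "大学"]
s1 = hybrid_score(q, q_words, "北京的大学很多", ["北京", "的", "大学", "很多"])
s2 = hybrid_score(q, q_words, "大人学习京剧北上", ["大人", "学习", "京剧", "北上"])
```

A purely character-based ranker would score both documents equally; the word-based component is what separates them, which is the motivation the abstract gives for the hybrid model.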