971 results for Convergence model
Abstract:
A distinctive feature of Chinese text is that a Chinese document is a sequence of Chinese characters with no spaces or boundaries between words. This feature makes Chinese information retrieval more difficult: a retrieved document that contains the query term as a sequence of Chinese characters may not actually be relevant to the query, because that character sequence may not form a valid Chinese word in the document. On the other hand, a document that is actually relevant may not be retrieved because it does not contain the query sequence but contains other relevant words. In this research, we propose a hybrid Chinese information retrieval model that incorporates word-based techniques alongside traditional character-based techniques. The aim of this approach is to investigate the influence of Chinese segmentation on the performance of Chinese information retrieval. Two ranking methods are proposed to rank retrieved documents by relevance to the query, calculated by combining character-based ranking and word-based ranking. Our experimental results show that Chinese segmentation can improve the performance of Chinese information retrieval, but the improvement is not significant when segmentation is merely incorporated alongside the traditional character-based approach.
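The combination of character-based and word-based relevance described above can be sketched as a weighted score fusion. This is a minimal illustration, not the paper's actual ranking formulas: the bigram-overlap scores and the mixing weight `alpha` are hypothetical placeholders for whatever character-level and word-level relevance functions the proposed methods use.

```python
def char_bigrams(text):
    # Overlapping character bigrams, a common unit in character-based Chinese IR.
    return {text[i:i + 2] for i in range(len(text) - 1)}

def char_score(query, doc):
    # Fraction of the query's character bigrams found in the document.
    q = char_bigrams(query)
    return len(q & char_bigrams(doc)) / len(q) if q else 0.0

def word_score(query_words, doc_words):
    # Fraction of segmented query words found in the segmented document.
    q = set(query_words)
    return len(q & set(doc_words)) / len(q) if q else 0.0

def hybrid_score(query, doc, query_words, doc_words, alpha=0.5):
    # Weighted combination of word-based and character-based relevance;
    # alpha is an assumed mixing weight, not a value from the paper.
    return alpha * word_score(query_words, doc_words) + (1 - alpha) * char_score(query, doc)
```

With `alpha = 0` this degenerates to the purely character-based baseline, and with `alpha = 1` to a purely word-based (segmentation-dependent) ranking, which is the spectrum the experiments compare.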
Abstract:
Australia’s efforts to transition to a low-emissions economy have stagnated following the successive defeats of the Carbon Pollution Reduction Scheme. This failure should not, however, be regarded as the end of Australia’s efforts to make this transition. In fact, the opportunity now exists for Australia to refine its existing arrangements to enable this transition to occur more effectively. The starting point for this analysis is the legal arrangements applying to the electricity generation sector, which is the largest sectoral emitter of anthropogenic greenhouse gas emissions in Australia. Without an effective strategy to mitigate this sector’s contribution to anthropogenic climate change, it is unlikely that Australia will be able to transition towards a low-emissions economy. It is on this basis that this article assesses the dominant national legal arrangement – the Renewable Energy Target – underpinning the electricity generation sector's efforts to become a low-emissions sector.
Abstract:
Genetic variation is the resource animal breeders exploit in stock improvement programs. Both the process of selection and the husbandry practices employed in aquaculture will erode genetic variation levels over time; this critical resource can therefore be lost, which may compromise future genetic gains in breeding programs. The amount of genetic variation in five lines of Sydney Rock Oyster (SRO) that had been selected for QX ('Queensland unknown') disease resistance was examined and compared with that in a wild reference population using seven SRO-specific microsatellite loci. The five selected lines had significantly lower levels of genetic diversity than the wild reference population, with allelic diversity declining by approximately 80%, but impacts on heterozygosity per locus were less severe. Significant deficiencies in heterozygotes were detected at six of the seven loci in both the mass selected lines and the wild reference population. Against this trend, however, a significant excess of heterozygotes was recorded at three loci (Sgo9, Sgo14 and Sgo21) in three QX disease resistant lines (#2, #5 and #13). All populations were significantly genetically differentiated from each other based on pairwise FST values. A neighbour-joining tree based on DA genetic distances showed a clear separation between all cultured and wild populations. The results of this study clearly show that the stock improvement program for SRO has significantly eroded natural levels of genetic variation in the cultured lines. This could compromise long-term genetic gains and affect the sustainability of the SRO breeding program.
Abstract:
The Texas Transportation Commission (“the Commission”) is responsible for planning and making policies for the location, construction, and maintenance of a comprehensive system of highways and public roads in Texas. In order for the Commission to carry out its legislative mandate, the Texas Constitution requires that most revenue generated by motor vehicle registration fees and motor fuel taxes be used for constructing and maintaining public roadways and other designated purposes. The Texas Department of Transportation (TxDOT) assists the Commission in executing state transportation policy. It is the responsibility of the legislature to appropriate money for TxDOT’s operation and maintenance expenses. All money authorized to be appropriated for TxDOT’s operations must come from the State Highway Fund (also known as Fund 6, Fund 006, or Fund 0006). The Commission can then use the balance in the fund to fulfill its responsibilities. However, the value of the revenue received in Fund 6 is not keeping pace with growing demand for transportation infrastructure in Texas. Additionally, diversion of revenue to nontransportation uses now exceeds $600 million per year. As shown in Figure 1.1, revenues and expenditures of the State Highway Fund per vehicle mile traveled (VMT) in Texas have remained almost flat since 1993. In the meantime, construction cost inflation has gone up more than 100%, effectively halving the value of expenditure.
Abstract:
This research report documents work conducted by the Center for Transportation Research (CTR) at The University of Texas at Austin in analyzing the Joint Analysis using the Combined Knowledge (J.A.C.K.) program. This program was developed by the Texas Department of Transportation (TxDOT) to make projections of revenues and expenditures. This research effort was to span from September 2008 to August 2009, but the bulk of the work was completed and presented by December 2008. J.A.C.K. was subsequently renamed TRENDS, but for consistency with the scope of work, the original name is used throughout this report.
Abstract:
Effective use of information and communication technologies (ICT) is necessary for delivering efficiency and improved project delivery in the construction industry. Convincing clients or contracting organisations to embrace ICT is a difficult task, as there are few templates of an ICT business model for the industry to use. ICT application in the construction industry is relatively low compared to the automotive and aerospace industries. The National Museum of Australia project provides a unique opportunity for investigating and reporting on this deficiency in publicly available knowledge. The paper concentrates on the business model's content and objectives, and briefly indicates the evaluation framework that was used to evaluate ICT effectiveness.
Abstract:
Competitive markets are increasingly driving new initiatives for shorter cycle times, resulting in increased overlapping of project phases. This, in turn, necessitates improving the interfaces between the different phases to be overlapped (integrated), thus allowing transfer of processes, information and knowledge from one individual or team to another. This transfer between phases, within and between projects, is one of the basic challenges to the philosophy of project management. To make the process transfer more transparent with minimal loss of momentum and project knowledge, this paper draws upon Total Quality Management (TQM) and Business Process Re-engineering (BPR) philosophies to develop a Best Practice Model for managing project phase integration. The paper presents the rationale behind the model's development and outlines its two key parts: (1) a Strategic Framework and (2) an Implementation Plan. Key components of both the Strategic Framework and the Implementation Plan are presented and discussed.
Abstract:
How does the image of the future operate upon history, and upon national and individual identities? To what extent are possible futures colonized by the image? What are the un-said futurecratic discourses that underlie the image of the future? Such questions inspired the examination of Japan's futures images in this thesis. The theoretical point of departure for this examination is Polak's (1973) seminal research into the theory of the 'image of the future', applied to seven contemporary Japanese texts which offer various alternative images for Japan's futures, selected as representative of a 'national conversation' about the futures of that nation. These seven images of the future are: 1. Report of the Prime Minister's Commission on Japan's Goals in the 21st Century—The Frontier Within: Individual Empowerment and Better Governance in the New Millennium, compiled by a committee headed by Japan's preeminent Jungian psychologist Kawai Hayao (1928-2007); 2. Slow Is Beautiful—a publication by Tsuji Shinichi, in which he re-images Japan as a culture represented by the metaphor of the sloth, concerned with slow and quality-oriented livingry as a preferred image of the future to Japan's current post-bubble cult of speed and economic efficiency; 3. MuRatopia—an image of the future in the form of a microcosmic prototype community and on-going project based on the historically significant island of Awaji, established by Japanese economist and futures thinker Yamaguchi Kaoru; 4. F.U.C.K, I Love Japan, by author Tanja Yujiro, which provides this seven-text line-up with a youth-oriented sub-culture perspective on that nation's futures; 5. IMAGINATION / CREATION—a compilation of round table discussions about Japan's futures seen from the point of view of Japan's creative vanguard; 6. Visionary People in a Visionless Country: 21 Earth Connecting Human Stories—a collection of twenty-one essays compiled by Denmark-born Tokyo resident Peter David Pedersen; and, 7. 
EXODUS to the Land of Hope, authored by Murakami Ryu, one of Japan's most prolific and influential writers—a novel suggesting a future scenario in which a massive exodus of Japan's youth, literate with state-of-the-art information and communication technologies (ICTs), moves en masse to Japan's northern island of Hokkaido to launch a cyber-revolution from the peripheries. The thesis employs a Futures Triangle Analysis (FTA) as the macro organizing framework and, as such, examines both the pushes of the present and the weights of the past before moving to focus on the pulls of the future represented by the seven texts mentioned above. Inayatullah's (1999) Causal Layered Analysis (CLA) is the analytical framework used in examining the texts. Poststructuralist concepts derived primarily from the work of Michel Foucault are a particular (but not exclusive) reference point for the analytical approach it encompasses. The research questions which reflect the triangulated analytic matrix are: 1. What are the pushes—in terms of current trends—that are affecting Japan's futures? 2. What are the historical and cultural weights that influence Japan's futures? 3. What are the emerging transformative Japanese image-of-the-future discourses, as embodied in actual texts, and what potential do they offer for transformative change in Japan? Research questions one and two are discussed in Chapter five, and research question three is discussed in Chapter six. The first two research questions should be considered preliminary. The weights outlined in Chapter five indicate that the forces working against change in Japan are formidable, structurally deep-rooted, wide-spread, and under-recognized as change-averse. Findings and analyses of the push dimension reveal strong forces towards a potentially very different type of Japan. 
However, it is the seven contemporary Japanese images of the future, which hold hope for transformative potential, that form the analytical heart of the thesis. In analyzing these texts the thesis establishes the richness of Japan's images of the future and, as such, demonstrates the robustness of Japan's stance vis-à-vis the problem of a perceived map-less and model-less future for Japan. Frontier is a useful image of the future, whose hybrid textuality, consisting of government, business, academia, and creative minority perspectives, demonstrates the earnestness of Japan's leaders in favour of the creation of innovative futures for that nation. Slow is powerful in its aim to reconceptualize Japan's philosophies of temporality and build a new kind of nation founded on the principles of a human-oriented and expanded vision of economy based around the core metaphor of slowness culture. However, its viability in Japan, with its post-Meiji historical pushes towards an increasingly speed-obsessed social construction of reality, could render it impotent. MuRatopia is compelling in its creative hybridity indicative of an advanced IT society, set in a modern-day utopian space based upon principles of a highly communicative social paradigm and sustainability. IMAGINATION / CREATION is less a plan than a platform for a new discussion on Japan's transformation from an econo-centric social framework to a new Creative Age. It accords with emerging discourses from the Creative Industries, which would re-conceive of Japan as a leading maker of meaning, rather than as the so-called guzu, a term referred to in the book meaning 'laggard'. Tanja's Love Japan is the most idiosyncratic of all the images of the future discussed. Its communication style, which appeals to Japan's youth cohort, establishes it as a potentially formidable change agent in a competitive market of futures images. 
Visionary People is a compelling image for its revolutionary and subversive stance against Japan's vision-less political leadership, showing that it is the people, not the futures-making elite or aristocracy, who must take the lead and create a new vanguard for the nation. Finally, Murakami's Exodus cannot be ruled out as a compelling image of the future. Sharing the appeal of Tanja's Love Japan to an increasingly disenfranchised youth, Exodus portrays a near-term future that is achievable in the here and now, by Japan's teenagers, using information and communications technologies (ICTs) to subvert leadership and create utopianist communities based on alternative social principles. The principal theoretical contribution of this investigation is the development of the Japanese image of the future. In this respect, the literature reviews represent a significant compilation, specifically about Japanese futures thinking, the Japanese image of the future, and the Japanese utopia. Though not exhaustive, this compilation will hopefully serve as a useful starting point for future research, not only on the Japanese image of the future, but also for image-of-the-future research generally. Many of the sources are in Japanese, and their English summations add to the value of this compilation. Secondly, the seven images of the future analysed in Chapter six represent the first time that Japanese image-of-the-future texts have been systematically organized and analysed. Their translation from Japanese to English can be claimed as a significant secondary contribution. What is more, they have been analysed according to current futures methodologies that reveal a layeredness, depth, and overall richness existing in Japanese futures images. 
Revealing this image-richness has been one of the most significant findings of this investigation, suggesting that there is fertile research to be found in this still under-explored field, whose implications go beyond domestic Japanese concerns and may offer rich material for futures thinkers and researchers, Japanologists, social planners, and policy makers.
Abstract:
A Geant4-based simulation tool has been developed to perform Monte Carlo modelling of a 6 MV Varian™ iX Clinac. The computer aided design interface of Geant4 was used to accurately model the linac components, including the Millennium multi-leaf collimators (MLCs). The simulation tool was verified via simulation of standard commissioning dosimetry data acquired with an ionisation chamber in a water phantom. Verification of the MLC model was achieved by simulation of leaf leakage measurements performed using Gafchromic™ film in a solid water phantom. An absolute dose calibration capability was added by including a virtual monitor chamber in the simulation. Furthermore, a DICOM-RT interface was integrated with the application to allow the simulation of treatment plans in radiotherapy. The ability of the simulation tool to accurately model leaf movements and doses at each control point was verified by simulation of a widely used intensity-modulated radiation therapy (IMRT) quality assurance (QA) technique, the chair test.
Abstract:
The need to develop effective and efficient training programs has been recognised by all sectors engaged in training. In responding to this need, focus has been directed to developing good competency statements and performance indicators to measure outcomes. Very little has been done to understand how competency statements are translated into good performance. To conceptualise this translation process, a representational model based on an information processing paradigm is proposed and discussed. It is argued that learners' prior knowledge and the effectiveness of the instructional material are two variables that have a significant bearing on how effectively competency knowledge is translated into outcomes. To contextualise the model, examples from apprentice training are used.
Abstract:
Forest policy and forestry management in Tasmania have undergone a number of changes in the last thirty years, many explicitly aimed at improving industry sustainability, job security, and forest biodiversity conservation. Yet forestry remains a contentious issue in Tasmania, due to a number of interacting factors, the most significant of which is the prevalence of a 'command and control' governance approach by policymakers and managers. New approaches such as multiple-stakeholder decision-making, adaptive management, and direct public participation in policymaking are needed. Such an approach has been attempted in Canada in the last decade, through the Canadian Model Forest Program, and may be suitable for Tasmania. This paper seeks to describe what the Canadian Model Forest approach is, how it may be implemented in Tasmania, and what role it may play in the shift to a new forestry paradigm. Until such a paradigm shift occurs, contentions and confrontations are likely to continue.
Abstract:
We present a hierarchical model for assessing an object-oriented program's security. Security is quantified using structural properties of the program code to identify the ways in which `classified' data values may be transferred between objects. The model begins with a set of low-level security metrics based on traditional design characteristics of object-oriented classes, such as data encapsulation, cohesion and coupling. These metrics are then used to characterise higher-level properties concerning the overall readability and writability of classified data throughout the program. In turn, these metrics are then mapped to well-known security design principles such as `assigning the least privilege' and `reducing the size of the attack surface'. Finally, the entire program's security is summarised as a single security index value. These metrics allow different versions of the same program, or different programs intended to perform the same task, to be compared for their relative security at a number of different abstraction levels. The model is validated via an experiment involving five open source Java programs, using a static analysis tool we have developed to automatically extract the security metrics from compiled Java bytecode.
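The hierarchical roll-up described above, from per-class structural metrics to a single program-level security index, can be sketched in miniature. This is a hypothetical illustration only: the exposure measure and the simple averaging used here stand in for the paper's actual metric suite and its mapping onto design principles, which are more elaborate.

```python
def attribute_exposure(cls):
    # Low-level metric (assumed form): the fraction of a class's 'classified'
    # attributes that are publicly readable/writable, i.e. poorly encapsulated.
    classified = [a for a in cls["attrs"] if a["classified"]]
    if not classified:
        return 0.0
    exposed = [a for a in classified if a["public"]]
    return len(exposed) / len(classified)

def security_index(classes):
    # Program-level index (assumed aggregation): 1.0 means no classified
    # data is exposed anywhere; 0.0 means all of it is. Averaging exposures
    # lets two versions of the same program be compared by a single number.
    if not classes:
        return 1.0
    return 1.0 - sum(attribute_exposure(c) for c in classes) / len(classes)
```

In the model itself, the intermediate layers (readability/writability of classified data, least privilege, attack surface) sit between these two levels rather than being collapsed directly as here.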
Abstract:
The functional properties of cartilaginous tissues are determined predominantly by the content, distribution, and organization of proteoglycan and collagen in the extracellular matrix. Extracellular matrix accumulates in tissue-engineered cartilage constructs by metabolism and transport of matrix molecules, processes that are modulated by physical and chemical factors. Constructs incubated under free-swelling conditions with freely permeable or highly permeable membranes exhibit symmetric surface regions of soft tissue. The variation in tissue properties with depth from the surfaces suggests the hypothesis that the transport processes mediated by the boundary conditions govern the distribution of proteoglycan in such constructs. A continuum model (DiMicco and Sah in Transport Porous Med 50:57-73, 2003) was extended to test the effects of membrane permeability and perfusion on proteoglycan accumulation in tissue-engineered cartilage. The concentrations of soluble, bound, and degraded proteoglycan were analyzed as functions of time, space, and non-dimensional parameters for several experimental configurations. The results of the model suggest that the boundary condition at the membrane surface and the rate of perfusion, described by non-dimensional parameters, are important determinants of the pattern of proteoglycan accumulation. With perfusion, the proteoglycan profile is skewed, and decreases or increases in magnitude depending on the level of flow-based stimulation. Utilization of a semi-permeable membrane, with or without unidirectional flow, may lead to tissues with depth-increasing proteoglycan content, resembling native articular cartilage.
Abstract:
Creative Industries was adopted as a platform in the 1990s by the Blair government in the UK to describe the convergence of the arts, media, communication and information technologies as a newly formed cluster, providing economic and cultural capital for the knowledge economy. The philosophy and rhetoric which has grown around this concept (Leadbeater 2000, Castells 2000, Florida 2000, Caves 2000, Hartley 2000) has been influential in re-contextualising culture and the arts in the 21st century. Where governments and educational institutions have embraced the context of the creative industries, it is having a profound effect on the way the arts are being positioned, originally as 'creative content' for the new economy. Countries and regions which have actively targeted the Creative Industries as an important economic growth factor in a post-industrial environment are numerous, but it is interesting to note that North and South East Asia and Australia have been at the forefront of developing the Creative Industries in its various guises. It could be argued that the initial phase of Creative Industries concentrated on media and communication technologies to provide commercial outcomes in small incubator business models; developing, for example, products for the games industry. Creative Industries is now entering a second phase of development, one in which the broader palette of the arts, though still not at the forefront of debate, is being re-examined. Both phases of Creative Industries have emphasised creativity and innovation as key drivers in the success and effectiveness of this sector, and although the arts by no means have a monopoly on these drivers, it is here that they have an important part to play in the creative industries context. Arguably, the second wave of the creative industries acknowledges to a greater extent that commercialisation works in tandem with government and other support in a complex mixed economic model. 
In relation to the performing arts, the global market has seen an increase in large-scale cultural events such as festivals which are providing employment for the arts industry and multiplier effects in other parts of the economy. Differentiated product is important in this competitive arena and the use of mediated and digitised environments has been able to increase the amount of arts product available to an international market. This changed environment requires the development of new skills for our artists and producers and has given rise to a reappraisal of approaches to arts training and research in the Higher Degree Education sector (Brown 2007, Cunningham 2006). This paper examines pedagogical changes which took place in the first Creative Industries Faculty in the world at Queensland University of Technology as well as the increased opportunities for leading research initiatives. It concludes with the example of an interdisciplinary artwork produced in a creative industries precinct, exemplifying the convergence of arts and communication technologies and that of artistic practice and research.
Abstract:
Estimating and predicting degradation processes of engineering assets is crucial for reducing cost and ensuring the productivity of enterprises. Assisted by modern condition monitoring (CM) technologies, most asset degradation processes can be revealed by various degradation indicators extracted from CM data. Maintenance strategies developed using these degradation indicators (i.e. condition-based maintenance) are more cost-effective, because unnecessary maintenance activities are avoided when an asset is still in a healthy state. A practical difficulty in condition-based maintenance (CBM) is that degradation indicators extracted from CM data can only partially reveal asset health states in most situations. Underestimating this uncertainty in the relationships between degradation indicators and health states can cause excessive false alarms, or failures without pre-alarms. The state space model provides an efficient approach to describing a degradation process using indicators that only partially reveal health states. However, existing state space models that describe asset degradation processes largely depend on assumptions such as discrete time, discrete state, linearity, and Gaussianity. The discrete time assumption requires that failures and inspections happen only at fixed intervals. The discrete state assumption entails discretising continuous degradation indicators, which requires expert knowledge and often introduces additional errors. The linear and Gaussian assumptions are not consistent with the nonlinear and irreversible degradation processes of most engineering assets. This research proposes a Gamma-based state space model, free of the discrete time, discrete state, linear and Gaussian assumptions, to model partially observable degradation processes. Monte Carlo-based algorithms are developed to estimate model parameters and asset remaining useful lives. 
In addition, this research develops a continuous-state partially observable semi-Markov decision process (POSMDP) to model a degradation process that follows the Gamma-based state space model under various maintenance strategies. Optimal maintenance strategies are obtained by solving the POSMDP. Simulation studies in MATLAB are performed; case studies using data from an accelerated life test of a gearbox and from the liquefied natural gas industry are also conducted. The results show that the proposed Monte Carlo-based EM algorithm can estimate model parameters accurately. The results also show that the proposed Gamma-based state space model fits the monotonically increasing degradation data from the gearbox accelerated life test better than linear and Gaussian state space models. Furthermore, both the simulation and case studies show that the prediction algorithm based on the Gamma-based state space model can identify the mean value and confidence interval of asset remaining useful lives accurately. In addition, the simulation study shows that the proposed POSMDP-based maintenance strategy optimisation method is more flexible than one that assumes a predetermined strategy structure and uses renewal theory. Moreover, the simulation study also shows that the proposed method can obtain more cost-effective strategies than a recently published maintenance strategy optimisation method, by simultaneously optimising the next maintenance activity and the waiting time until it.
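The Gamma-process view of degradation and the Monte Carlo estimation of remaining useful life (RUL) described above can be sketched as a first-passage simulation: degradation increments over each interval are Gamma-distributed (strictly positive, so the process is monotonic and irreversible), and RUL is the time until the accumulated degradation first crosses a failure threshold. This is a minimal illustration under assumed, fully observable dynamics; the thesis's model additionally handles partial observability and fits its parameters via a Monte Carlo EM algorithm, none of which is reproduced here. All parameter values are illustrative.

```python
import random

def simulate_mean_rul(x0, threshold, shape, scale, dt=1.0, n_paths=2000, seed=1):
    """Monte Carlo estimate of mean remaining useful life for a stationary
    Gamma-process degradation model: the increment over an interval dt is
    drawn from Gamma(shape * dt, scale), so increments are positive and the
    degradation path is monotonically increasing."""
    rng = random.Random(seed)
    first_passage_times = []
    for _ in range(n_paths):
        x, t = x0, 0.0
        while x < threshold:          # run until first passage of the threshold
            x += rng.gammavariate(shape * dt, scale)
            t += dt
        first_passage_times.append(t)
    return sum(first_passage_times) / len(first_passage_times)
```

With mean increment `shape * dt * scale` per step, the estimated mean RUL is roughly `(threshold - x0) / (shape * scale)` plus the overshoot of the final jump; the empirical spread of `first_passage_times` would give the confidence interval mentioned in the abstract.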