977 results for Web testing


Relevance: 20.00%

Abstract:

Nowadays people rely heavily on the Internet for information and knowledge. Wikipedia is an online multilingual encyclopaedia that contains a very large number of detailed articles covering most written languages, and it is often considered a treasury of human knowledge. It includes extensive hypertext links between documents of the same language for easy navigation. However, pages in different languages are rarely cross-linked, except for direct equivalent pages on the same subject. This poses serious difficulties for users seeking information or knowledge from sources in another language, or where there is no equivalent page in one language or another. In this thesis, a new information retrieval task, cross-lingual link discovery (CLLD), is proposed to tackle the lack of cross-lingual anchored links in a knowledge base such as Wikipedia. In contrast to traditional information retrieval tasks, cross-lingual link discovery algorithms actively recommend a set of meaningful anchors in a source document and establish links to documents in an alternative language. In other words, cross-lingual link discovery is a way of automatically finding hypertext links between documents in different languages, which is particularly helpful for knowledge discovery across language domains. This study focuses specifically on Chinese / English link discovery (C/ELD), a special case of cross-lingual link discovery that involves natural language processing (NLP), cross-lingual information retrieval (CLIR) and cross-lingual link discovery. To assess the effectiveness of CLLD, a standard evaluation framework is also proposed. The framework includes topics, document collections, a gold standard dataset, evaluation metrics, and toolkits for run pooling, link assessment and system evaluation; with it, the performance of CLLD approaches and systems can be quantified. This thesis contributes to research on natural language processing and cross-lingual information retrieval in CLLD: 1) a new, simple but effective Chinese segmentation method, n-gram mutual information, is presented for determining the boundaries of Chinese text; 2) a voting mechanism for named entity translation is demonstrated to achieve high-precision English / Chinese machine translation; 3) a link mining approach that mines the existing link structure for anchor probabilities achieves encouraging results in suggesting cross-lingual Chinese / English links in Wikipedia. This approach was examined in experiments, carried out as part of the study, on better automatic generation of cross-lingual links. The overall major contribution of this thesis is a standard evaluation framework for cross-lingual link discovery research, which is important for benchmarking the performance of various CLLD systems and for identifying good CLLD approaches. The evaluation methods and framework described in this thesis were used to quantify system performance in the NTCIR-9 Crosslink task, the first information retrieval track of its kind.
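The abstract does not describe the implementation, but the anchor-probability idea it refers to is commonly estimated from an existing link structure as the fraction of pages containing a phrase that also link it. A minimal sketch under that assumption, with a hypothetical `pages` corpus of article texts and their existing anchors, might look like this:

```python
from collections import Counter

def anchor_probabilities(pages):
    """Estimate P(phrase is linked | phrase occurs) from existing links.

    `pages` is a hypothetical corpus: each page is a dict with the plain
    `text` of an article and the list of `anchors` already linked in it.
    """
    # Every phrase that is used as a link anchor somewhere is a candidate.
    candidates = {a for page in pages for a in page["anchors"]}

    linked = Counter()   # pages in which the phrase occurs as a link anchor
    occurs = Counter()   # pages in which the phrase occurs at all

    for page in pages:
        text = page["text"]
        page_anchors = set(page["anchors"])
        for phrase in candidates:
            if phrase in text:
                occurs[phrase] += 1
                if phrase in page_anchors:
                    linked[phrase] += 1

    return {p: linked[p] / occurs[p] for p in candidates if occurs[p]}


def suggest_anchors(document_text, probabilities, threshold=0.5):
    """Recommend anchors for a new source document: phrases that occur in the
    text and whose historical anchor probability exceeds the threshold."""
    return sorted(
        (p for p, prob in probabilities.items()
         if prob >= threshold and p in document_text),
        key=lambda p: -probabilities[p],
    )
```

In the cross-lingual setting described in the thesis, each suggested anchor would still need to be mapped to a target document in the other language, for example via existing cross-language links or translation; that step is not shown here.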

Relevance: 20.00%

Abstract:

Motivation: Unravelling the genetic architecture of complex traits requires large amounts of data, sophisticated models and large computational resources. The lack of user-friendly software incorporating all of these requisites is delaying progress in the analysis of complex traits. Methods: Linkage disequilibrium and linkage analysis (LDLA) is a high-resolution gene-mapping approach based on sophisticated mixed linear models and applicable to any population structure. LDLA can use population history information, in addition to pedigree and molecular markers, to decompose traits into genetic components. Analyses are distributed in parallel over a large public grid of computers in the UK. Results: We have demonstrated the performance of LDLA with analyses of simulated data. There are real gains in statistical power to detect quantitative trait loci when using historical information compared with traditional linkage analysis. Moreover, the use of a grid of computers significantly increases computational speed, allowing analyses that would have been prohibitive on a single computer. © The Author 2009. Published by Oxford University Press. All rights reserved.
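The abstract gives no implementation detail, so the following is only a rough sketch of the parallelisation idea: each putative QTL position is an independent job, farmed out here with Python's multiprocessing as a stand-in for a computing grid, and the full LDLA mixed linear model is replaced by a simple single-marker regression purely for illustration.

```python
import numpy as np
from multiprocessing import Pool

def scan_position(args):
    """Test one putative QTL position.

    A stand-in for the full LDLA mixed model: regress the phenotype on the
    genotype score at this position and return the reduction in residual sum
    of squares relative to the mean-only model as a crude test statistic.
    """
    phenotype, genotypes = args
    x = np.column_stack([np.ones_like(genotypes), genotypes])
    _, rss, *_ = np.linalg.lstsq(x, phenotype, rcond=None)
    null_rss = np.sum((phenotype - phenotype.mean()) ** 2)
    return float(null_rss - rss[0]) if rss.size else 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_individuals, n_positions = 500, 200
    geno = rng.integers(0, 3, size=(n_positions, n_individuals)).astype(float)
    pheno = geno[42] * 0.5 + rng.normal(size=n_individuals)  # simulated QTL at position 42

    # Each position is an independent job, so the scan distributes naturally
    # over many workers; on a grid, each worker would be a remote node.
    with Pool() as pool:
        stats = pool.map(scan_position, [(pheno, g) for g in geno])
    print("best position:", int(np.argmax(stats)))
```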

Relevance: 20.00%

Abstract:

The well-known difficulties students exhibit when learning to program are often characterised as either difficulties in understanding the problem to be solved or difficulties in devising and coding a computational solution. It would therefore be helpful to understand which of these gives students the greatest trouble. Unit testing is a mainstay of large-scale software development and maintenance. A unit test suite serves not only for acceptance testing, but is also a form of requirements specification, as exemplified by agile programming methodologies in which the tests are developed before the corresponding program code. In order to better understand students' conceptual difficulties with programming, we conducted a series of experiments in which students were required to write both unit tests and program code for non-trivial problems. Their code and tests were then assessed for correctness and 'coverage', respectively. The results allowed us to directly compare students' abilities to characterise a computational problem, as a unit test suite, and to develop a corresponding solution, as executable code. Since understanding a problem is a prerequisite to solving it, we expected students' unit testing skills to be a strong predictor of their ability to successfully implement the corresponding program. Instead, however, we found that students' testing abilities lag well behind their coding skills.
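The exercises used in the experiments are not reproduced in the abstract; as an illustration of how a unit test suite can act as a requirements specification written before the code, a hypothetical test-first fragment for a small problem might look like this:

```python
import unittest

# Hypothetical exercise: "return the length of the longest run of identical
# consecutive characters in a string". Written test-first, the suite below
# specifies the behaviour before any implementation exists.

def longest_run(s: str) -> int:
    """Length of the longest run of identical consecutive characters."""
    best = current = 0
    previous = None
    for ch in s:
        current = current + 1 if ch == previous else 1
        best = max(best, current)
        previous = ch
    return best

class LongestRunSpec(unittest.TestCase):
    def test_empty_string_has_no_run(self):
        self.assertEqual(longest_run(""), 0)

    def test_single_character(self):
        self.assertEqual(longest_run("a"), 1)

    def test_run_in_the_middle(self):
        self.assertEqual(longest_run("abbbc"), 3)

    def test_ties_return_the_common_length(self):
        self.assertEqual(longest_run("aabb"), 2)

if __name__ == "__main__":
    unittest.main()
```

In the study's terms, a suite like the one above can be assessed for coverage independently of whether the accompanying implementation is correct, which is what allows testing ability and coding ability to be compared separately.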

Relevance: 20.00%

Abstract:

This paper reports research into teacher-librarians' perceptions of using social media and Web 2.0 in teaching and learning. A pilot study was conducted with teacher-librarians in five government schools and five private schools in southeast Queensland. The findings revealed a strong digital divide between government schools and private schools, with government schools suffering severe restrictions on the use of social media and Web 2.0, leading to an unsophisticated use of these technologies. It is argued that internet 'over-blocking' may leave government school students unable to manage risks in an open internet environment. Furthermore, their use of information for academic and recreational learning may be compromised. This has implications particularly for low socioeconomic students, leading to further inequity in the process and outcomes of Australian education.

Relevance: 20.00%

Abstract:

Hepatitis B is a significant public health challenge within some subpopulations in Australia, including Chinese and Vietnamese migrants. There has been limited research on hepatitis B knowledge and actions in these communities. The authors conducted a self-administered survey among 442 Chinese and 433 Vietnamese respondents in Brisbane. Overall, knowledge is best described as "moderate": one in two respondents could not identify the sexual transmission risk, and fewer than one third knew that sharing food or drinks does not spread the disease. The majority of Vietnamese (80%) and 60% of Chinese respondents indicated prior testing. Vaccination was reported by 60% of the Vietnamese and 52% of the Chinese. Knowledge was better among Chinese respondents who had been tested and vaccinated than among those who were untested and unvaccinated. Only 3.5% of the Chinese, but 11.6% of the Vietnamese, indicated having had a positive test result for hepatitis B virus. This study helps identify strategies for programs targeting both communities and practitioners.

Relevance: 20.00%

Abstract:

The main objective of this paper is to describe the development of a remote-sensing airborne air sampling system for Unmanned Aerial Systems (UAS), providing the capability to detect particle and gas concentrations in real time over remote locations. The design of the air sampling methodology started by defining the system architecture, and then by selecting and integrating each subsystem. A multifunctional air sampling instrument, capable of simultaneous measurement of particle and gas concentrations, was modified and integrated with ARCAA's Flamingo UAS platform and communications protocols. As a result of the integration process, a system capable of both real-time geo-location monitoring and indexed-link sampling was obtained. Wind tunnel tests were conducted to evaluate the performance of the air sampling instrument in controlled non-stationary conditions at the typical operational velocities of the UAS platform. Once a fully operational remote air sampling system was obtained, the problem of mission design was analysed through the simulation of different scenarios. Flight tests of the complete air sampling system were then conducted to check the dynamic characteristics of the UAS with the air sampling system and to prove its capability to perform an air sampling mission following a specific flight path.
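The paper's software is not described in the abstract, so the sketch below only illustrates the geo-located, indexed sampling idea: hypothetical read_gps() and read_sensor() placeholders stand in for the autopilot telemetry and the air sampling instrument, and each reading is logged against the current position.

```python
import csv
import time
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp: float
    latitude: float
    longitude: float
    altitude_m: float
    particle_count: float
    gas_ppm: float

def read_gps():
    """Hypothetical placeholder for the UAS autopilot/GPS telemetry feed."""
    return {"lat": -27.47, "lon": 153.02, "alt": 120.0}

def read_sensor():
    """Hypothetical placeholder for the particle/gas instrument readout."""
    return {"particles": 1.3e4, "gas": 0.42}

def sampling_loop(log_path, period_s=1.0, duration_s=10.0):
    """Pair every instrument reading with the current position so that each
    measurement is indexed to a point on the flight path."""
    end = time.time() + duration_s
    with open(log_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time", "lat", "lon", "alt_m", "particles", "gas_ppm"])
        while time.time() < end:
            fix, reading = read_gps(), read_sensor()
            s = Sample(time.time(), fix["lat"], fix["lon"], fix["alt"],
                       reading["particles"], reading["gas"])
            writer.writerow([s.timestamp, s.latitude, s.longitude,
                             s.altitude_m, s.particle_count, s.gas_ppm])
            time.sleep(period_s)

if __name__ == "__main__":
    sampling_loop("air_samples.csv", duration_s=5.0)
```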

Relevance: 20.00%

Abstract:

This paper presents the details of numerical studies on the shear behaviour and strength of lipped channel beams (LCBs) with stiffened web openings. Over the last couple of decades, cold-formed steel beams have been used extensively in residential, industrial and commercial buildings as primary load-bearing structural components. Their shear strengths are considerably reduced when web openings are included for the purpose of locating building services. Our research has shown that the shear strengths of LCBs were reduced by up to 70% due to the inclusion of web openings, hence there is a need to improve the shear strengths of LCBs with web openings. A cost-effective way to mitigate the detrimental effects of a large web opening is to attach appropriate stiffeners around the opening in order to restore the original shear strength and stiffness of the LCB. Numerical studies were therefore undertaken to investigate the shear strengths of LCBs with stiffened web openings. In this research, finite element models of LCBs with stiffened web openings in shear were developed to simulate their shear behaviour and strength. Various stiffening methods using plate and LCB stud stiffeners attached to the LCBs by screw fastening were investigated. The developed models were then validated by comparing their results with experimental results and used in parametric studies. Both finite element analysis and experimental results showed that the stiffening arrangements recommended by past research for cold-formed steel channel beams are not adequate to restore the shear strengths of LCBs with web openings. Therefore new stiffener arrangements were proposed for LCBs with web openings based on experimental and finite element analysis results. This paper presents the details of the finite element models and analyses used in this research and the results, including the recommended stiffener arrangements.

Relevance: 20.00%

Abstract:

An increasing body of research is highlighting the involvement of illicit drugs in many road fatalities. Deterrence theory has been a core conceptual framework underpinning traffic enforcement as well as interventions designed to reduce road fatalities. Essentially the effectiveness of deterrence-based approaches is predicated on perceptions of certainty, severity, and swiftness of apprehension. However, much less is known about how the awareness of legal sanctions can impact upon the effectiveness of deterrence mechanisms and whether promoting such detection methods can increase the deterrent effect. Nevertheless, the implicit assumption is that individuals aware of the legal sanctions will be more deterred. This study seeks to explore how awareness of the testing method impacts upon the effectiveness of deterrence-based interventions and intentions to drug drive again in the future. In total, 161 participants who reported drug driving in the previous six months took part in the current study. The results show that awareness of testing had a small effect upon increasing perceptions of the certainty of apprehension and severity of punishment. However, awareness was not a significant predictor of intentions to drug drive again in the future. Importantly, higher levels of drug use were a significant predictor of intentions to drug drive in the future. Whilst awareness does have a small effect on deterrence variables, the influence of levels of drug use seems to reduce any deterrent effect.

Relevance: 20.00%

Abstract:

Purpose - Researchers debate whether tacit knowledge sharing through Information Technology (IT) is actually possible. However, with the advent of social web tools, it has been argued that most shortcomings of tacit knowledge sharing are likely to disappear. This paper has two purposes: firstly, to set out the existing debates in the literature regarding tacit knowledge sharing using IT, and secondly, to identify key research gaps that lay the foundations for future research into tacit knowledge sharing using the social web. Design/methodology/approach - This paper reviews the current literature on IT-mediated tacit knowledge sharing and opens a discussion on tacit knowledge sharing through the use of the social web. Findings - First, the existing schools of thought regarding IT's ability to support tacit knowledge sharing are introduced. Next, the difficulties of sharing tacit knowledge through the use of IT are discussed. Then, the potentials and pitfalls of social web tools are presented. Finally, the paper concludes that whilst there are significant theoretical arguments supporting the view that the social web facilitates tacit knowledge sharing, there is a lack of empirical evidence to support these arguments and further work is required. Research limitations/implications - The limitations of the review include: covering only papers published in English, issues of access to the full texts of some resources, and the possibility of missing some resources due to the search strings used or the limited coverage of the databases searched. Originality/value - The paper contributes to the fast-growing literature on the intersection of KM and IT, particularly by focusing on tacit knowledge sharing in the social media space. The paper highlights the need for further studies in this area by discussing the current situation in the literature and disclosing the emerging questions and gaps for future studies.

Relevance: 20.00%

Abstract:

The study of destination brand performance measurement has only emerged in earnest as a field in the tourism literature since 2007. The concept of consumer-based brand equity (CBBE) is gaining favour from services marketing researchers as an alternative to the traditional ‘net-present-value of future earnings’ method of measuring brand equity. The perceptions-based CBBE model also appears suitable for examining destination brand performance, where a financial brand equity valuation on a destination marketing organisation’s (DMO) balance sheet is largely irrelevant. This is the first study to test and compare the model in both short and long haul markets. The paper reports the results of tests of a CBBE model for Australia in a traditional short haul market (New Zealand) and an emerging long haul market (Chile). The data from both samples indicated destination brand salience, brand image, and brand value are positively related to purchase intent for Australia in these two disparate markets.

Relevance: 20.00%

Abstract:

This paper examines the accuracy of using the built-in camera of smart phones and free software as an economical way to quantify and analyse light exposure by producing luminance maps from High Dynamic Range (HDR) images. HDR images of an indoor and an outdoor scene, each containing a wide variation of luminance, were captured with an Apple iPhone 4S. The HDR images were then processed using Photosphere software (Ward, 2010) to produce luminance maps, and individual pixel values were compared with calibrated luminance meter readings. This comparison showed an average luminance error of ~8% between the HDR image pixel values and the luminance meter readings when the range of luminances in the image is limited to approximately 1,500 cd/m².
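Photosphere's internal calibration is not detailed in the abstract; the sketch below assumes a linear HDR image and the conventional Radiance-style RGB-to-luminance conversion (both assumptions, not details from the paper) to show how a luminance map and a percentage error against a meter reading could be derived.

```python
import numpy as np

# Radiance-style conversion from linear RGB radiance values to photometric
# luminance (cd/m^2). The 179 lm/W luminous efficacy and the channel weights
# are the conventional Radiance values; the paper does not state which
# calibration Photosphere applied.
RGB_TO_LUMINANCE = np.array([0.265, 0.670, 0.065])
LUMINOUS_EFFICACY = 179.0  # lm/W

def luminance_map(hdr_rgb: np.ndarray) -> np.ndarray:
    """Convert a linear HDR image (H x W x 3, float) to luminance in cd/m^2."""
    return LUMINOUS_EFFICACY * (hdr_rgb @ RGB_TO_LUMINANCE)

def percent_error(map_value: float, meter_value: float) -> float:
    """Error of a pixel/region value against a calibrated luminance meter."""
    return 100.0 * abs(map_value - meter_value) / meter_value

if __name__ == "__main__":
    # A stand-in for the merged exposure stack; in practice this array would
    # be loaded from the .hdr file produced from the phone's exposures.
    hdr = np.random.default_rng(1).uniform(0.0, 8.0, size=(480, 640, 3))
    lum = luminance_map(hdr)
    print("peak luminance: %.0f cd/m2" % lum.max())
    # 250.0 cd/m2 is a hypothetical meter reading for the same spot.
    print("error vs meter: %.1f%%" % percent_error(lum[240, 320], 250.0))
```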

Relevance: 20.00%

Abstract:

The Theory of the Growth of the Firm by Edith Penrose, first published in 1959, is a seminal contribution to the field of management. Penrose's intention was to create a theory of firm growth that was logically consistent and empirically tractable (Buckley and Casson, 2007). Much attention, however, has been focused on her unintended contribution to the resource-based view (henceforth RBV) (e.g. Kor and Mahoney, 2004; Lockett and Thompson, 2004) rather than on her theory of firm growth. We feel this is unfortunate because, despite a rapidly growing body of empirical work, conceptual advancement in growth studies has been limited (Davidsson and Wiklund, 2000; Davidsson et al., 2006; Delmar, 1997; Storey, 1994). The growth literature frequently references Penrose's work, but little explicit testing of her ideas has been undertaken. This is surprising given that Penrose's work remains the most comprehensive theory of growth to date. One explanation is that she did not formally present her arguments, favouring verbal exposition over formalised models (Lockett, 2005; Lockett and Thompson, 2004). However, the central propositions and conclusions of her theory can be operationalised and empirically tested.

Relevance: 20.00%

Abstract:

Building and maintaining software are not easy tasks. However, thanks to advances in web technologies, a new paradigm is emerging in software development. The Service Oriented Architecture (SOA) is a relatively new approach that helps bridge the gap between business and IT and also helps systems remain flexible. However, there are still several challenges with SOA. As the number of available services grows, developers are faced with the problem of discovering the services they need. Public service repositories such as Programmable Web provide only limited search capabilities. Several mechanisms have been proposed to improve web service discovery by using semantics. However, most of these require manually tagging the services with concepts in an ontology. Adding semantic annotations is a non-trivial process that requires a certain skill set from the annotator, as well as the availability of domain ontologies that include the concepts related to the topics of the service. These issues have prevented such mechanisms from becoming widespread. This thesis focuses on two main problems. First, to avoid the overhead of manually adding semantics to web services, several automatic methods of including semantics in the discovery process are explored. Although experimentation with some of these strategies has been conducted in the past, the results reported in the literature are mixed. Second, Wikipedia is explored as a general-purpose ontology. The benefit of using it as an ontology is assessed by comparing these semantics-based methods with classic term-based information retrieval approaches. The contribution of this research is significant because, to the best of our knowledge, no comprehensive analysis exists of the impact of using Wikipedia as a source of semantics in web service discovery. The main outputs of this research are a web service discovery engine that implements these methods and a comprehensive analysis of the benefits and trade-offs of these semantics-based discovery approaches.
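The discovery engine itself is not reproduced in the abstract; as a sketch of the classic term-based baseline against which the Wikipedia-based semantics are compared, a minimal TF-IDF ranker over a hypothetical service registry might look like this:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical service registry; in practice the descriptions would come from
# a public repository such as Programmable Web.
services = {
    "WeatherLookup": "Returns the current temperature and forecast for a city.",
    "GeoCoder": "Converts street addresses into latitude and longitude coordinates.",
    "CurrencyConverter": "Converts an amount between two currencies using daily exchange rates.",
}

def rank_services(query, registry):
    """Classic term-based discovery: rank services by the TF-IDF cosine
    similarity between the query and each service description."""
    names = list(registry)
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform([registry[n] for n in names] + [query])
    scores = cosine_similarity(matrix[len(names)], matrix[:len(names)]).ravel()
    return sorted(zip(names, scores), key=lambda pair: -pair[1])

if __name__ == "__main__":
    for name, score in rank_services("find the weather forecast for Brisbane", services):
        print(f"{name}: {score:.3f}")
```

The semantics-based alternatives explored in the thesis would replace these term vectors with representations derived from Wikipedia concepts before computing similarity.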