977 results for Web testing
Abstract:
The behavioral theory of “entrepreneurial bricolage” attempts to understand what entrepreneurs do when faced with resource constraints. Most research about bricolage, defined as “making do by applying combinations of the resources at hand to new problems and opportunities” (Baker & Nelson, 2005: 333), has been qualitative and inductive (Garud & Karnoe, 2003). Although this has created a small body of rich descriptions and interesting insights, little deductive theory has been developed and the relationship between bricolage and firm performance has not been systematically tested. In particular, prior research has suggested that bricolage can have both beneficial and harmful effects. Ciborra’s (1996) study of Olivetti suggested that bricolage helped Olivetti to adapt, but simultaneously constrained firm effectiveness. Baker & Nelson (2005) suggested that bricolage may be harmful at very high levels, but more helpful if used judiciously. Other research suggests that firm innovativeness may play an important role in shaping the outcomes of bricolage (Anderson, 2008). In this paper, we theorize about and provide a preliminary test of the bricolage-performance relationship and how it is affected by firm innovativeness.
Abstract:
With the advent of Service Oriented Architecture, Web Services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service that meets the requirements of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery and match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user’s interest. Considering the semantic relationships of the words used in describing the services, as well as the input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individually matched services should then fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web service description language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost of traversal. The third phase, system integration, integrates the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
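To make the link-analysis phase concrete, the sketch below models a handful of hypothetical Web services as graph nodes and runs Floyd-Warshall, an all-pairs shortest-path algorithm, to recover the cheapest composition chain between two services. The service names and link costs are illustrative placeholders, not data from the thesis.

```python
import math

INF = math.inf

# Hypothetical services (nodes) and linking costs (edges); INF = not directly linkable.
services = ["GeocodeSvc", "WeatherSvc", "RouteSvc", "TrafficSvc"]
n = len(services)
cost = [
    [0,   2,   INF, 7],
    [INF, 0,   3,   INF],
    [INF, INF, 0,   1],
    [INF, INF, INF, 0],
]

# Floyd-Warshall: dist[i][j] becomes the minimum total cost of any composition
# chain from service i to service j; nxt[][] lets us reconstruct that chain.
dist = [row[:] for row in cost]
nxt = [[j if cost[i][j] < INF else None for j in range(n)] for i in range(n)]
for k in range(n):
    for i in range(n):
        for j in range(n):
            if dist[i][k] + dist[k][j] < dist[i][j]:
                dist[i][j] = dist[i][k] + dist[k][j]
                nxt[i][j] = nxt[i][k]

def composition_path(i, j):
    """Return the cheapest chain of services from i to j, or None if unreachable."""
    if nxt[i][j] is None:
        return None
    path = [services[i]]
    while i != j:
        i = nxt[i][j]
        path.append(services[i])
    return path

print(composition_path(0, 3), "total cost:", dist[0][3])
```

With the placeholder costs above, the cheapest composition from GeocodeSvc to TrafficSvc is the three-hop chain (cost 6) rather than the direct link (cost 7), which is exactly the kind of trade-off the link-analysis phase is meant to surface.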
Abstract:
Despite the increased offering of online communication channels to support web-based retail systems, there is limited marketing research investigating how these channels act, singly or in combination, to influence an individual's intention to purchase online. If the marketer's strategy is to encourage online transactions, this requires a focus on consumer acceptance of the web-based transaction technology, rather than the purchase of the products per se. The exploratory study reported in this paper examines normative influences from referent groups in an individual's online and offline social communication networks that might affect their intention to use online transaction facilities. The findings suggest that for non-adopters, there is no normative influence from referents in either network. For adopters, one online and one offline referent norm positively influenced this group's intentions to use online transaction facilities. The implications of these findings are discussed together with future research directions.
Abstract:
“Hardware in the Loop” (HIL) testing is widely used in the automotive industry. The sophisticated electronic control units used for vehicle control are usually tested and evaluated using HIL simulations. HIL increases the degree of realistic testing of any system. Moreover, it helps in designing the structure and control of the system under test so that it works effectively in the situations it will encounter in service. Due to the size and complexity of interaction within a power network, most research is based on pure simulation. To validate the performance of a physical generator or protection system, most testing is constrained to very simple power networks. This research, however, examines a method to test power system hardware within a complex virtual environment using the HIL concept. HIL testing of electronic control units and power system protection devices can easily be performed at signal level, but the performance of power system equipment, such as distributed generation systems, cannot be evaluated at signal level using HIL testing. HIL testing of power system equipment is termed here ‘Power Network in the Loop’ (PNIL). PNIL testing can only be performed at power level and requires a power amplifier that can amplify the simulation signal to the power level. A power network is divided into two parts: one part represents the Power Network Under Test (PNUT) and the other represents the rest of the complex network. The complex network is simulated in a real-time simulator (RTS) while the PNUT is connected to a Voltage Source Converter (VSC) based power amplifier. Two-way interaction between the simulator and the amplifier is performed using analog-to-digital (A/D) and digital-to-analog (D/A) converters. The power amplifier amplifies the current or voltage signal of the simulator to the power level and establishes the power-level interaction between the RTS and the PNUT. In the first part of this thesis, the design and control of a VSC-based power amplifier that can amplify a broadband voltage signal is presented. A new Hybrid Discontinuous Control method is proposed for the amplifier. This amplifier can be used for several power system applications; the first part of the thesis also presents its use in DSTATCOM and UPS applications. In the later part of this thesis, the solution for network-in-the-loop testing with the help of this amplifier is reported. The experimental setup for PNIL testing was built in the laboratory of Queensland University of Technology and the feasibility of PNIL testing has been evaluated through experimental studies. In the last section of this thesis, a universal load with power-regenerative capability is designed. This universal load is used to test DG systems using PNIL concepts. This thesis is composed of published/submitted papers that form the chapters of the dissertation. Each paper has been published or submitted during the period of candidature. Chapter 1 integrates all the papers to provide a coherent view of the wide-bandwidth switching amplifier and its use in different power system applications, especially for the solution of power system testing using PNIL.
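As a rough illustration of the PNIL arrangement described above, the following sketch emulates the closed loop in which the real-time simulator's boundary voltage is passed (via D/A) to the VSC-based power amplifier, the amplifier drives the Power Network Under Test, and the measured current is fed back (via A/D) into the simulation. All class and method names, gains and the toy resistive load are hypothetical stand-ins, not the thesis's actual hardware or software interfaces.

```python
import time

class RealTimeSimulator:
    """Simulates the 'rest of the network' and exposes the boundary bus voltage."""
    def __init__(self):
        self.boundary_voltage = 1.0   # per-unit
    def step(self, injected_current, dt):
        # Toy dynamics: the boundary voltage sags slightly with load current.
        self.boundary_voltage = 1.0 - 0.05 * injected_current
        return self.boundary_voltage

class PowerAmplifier:
    """Amplifies the simulator's low-level reference to power level for the PNUT."""
    gain = 230.0  # volts per per-unit, illustrative only
    def apply_voltage(self, reference_voltage):
        return self.gain * reference_voltage        # drive the PNUT terminals
    def read_pnut_current(self, applied_voltage):
        return applied_voltage / 460.0              # toy resistive PNUT, per-unit

def pnil_loop(steps=5, dt=1e-4):
    sim, amp = RealTimeSimulator(), PowerAmplifier()
    current = 0.0
    for _ in range(steps):
        v_ref = sim.step(current, dt)               # simulator output -> D/A reference
        v_applied = amp.apply_voltage(v_ref)        # amplifier drives the PNUT
        current = amp.read_pnut_current(v_applied)  # measured current -> A/D feedback
        print(f"v_ref={v_ref:.4f} pu, current={current:.4f} pu")
        time.sleep(dt)                              # stand-in for real-time pacing

pnil_loop()
```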
Abstract:
Aim: In the current climate of medical education, there is an ever-increasing demand for and emphasis on simulation as both a teaching and training tool. The objective of our study was to compare the realism and practicality of a number of artificial blood products that could be used for high-fidelity simulation. Method: A literature and internet search was performed and 15 artificial blood products were identified from a variety of sources. One product was excluded due to its potential toxicity risks. Five observers, blinded to the products, performed two assessments on each product using an evaluation tool with 14 predefined criteria including color, consistency, clotting, and staining potential to manikin skin and clothing. Each criterion was rated using a five-point Likert scale. The products were left for 24 hours, both refrigerated and at room temperature, and then reassessed. Statistical analysis was performed to identify the most suitable products, and both inter- and intra-rater variability were examined. Results: Three products scored consistently well with all five assessors, with one product in particular scoring well in almost every criterion. This highest-rated product had a mean rating of 3.6 out of 5.0 (95% posterior interval 3.4-3.7). Inter-rater variability was minor, with average ratings varying from 3.0 to 3.4 between the highest and lowest scorer. Intra-rater variability was negligible, with good agreement between the first and second ratings as measured by weighted kappa scores (K = 0.67). Conclusion: The most realistic and practical form of artificial blood identified was a commercial product called KD151 Flowing Blood Syrup. It was found to be not only realistic in appearance but also practical in terms of storage and stain removal.
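For readers unfamiliar with the agreement statistic quoted above, the short snippet below shows how a weighted kappa between an assessor's first and second ratings on a five-point Likert scale might be computed; the rating vectors are made-up placeholders, not the study's data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical first and second ratings of 14 products by one assessor (1-5 Likert scale).
first_rating  = [4, 3, 5, 2, 4, 3, 1, 4, 5, 3, 2, 4, 3, 5]
second_rating = [4, 3, 4, 2, 5, 3, 2, 4, 5, 3, 2, 4, 4, 5]

# Linear weighting penalises large disagreements more than near-misses.
kappa = cohen_kappa_score(first_rating, second_rating, weights="linear")
print(f"Weighted kappa: {kappa:.2f}")  # by common convention, 0.61-0.80 is 'substantial' agreement
```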
Abstract:
This project report presents the results of a study on wireless communication data transfer rates for a mobile device running a custom-built construction defect reporting application. The study measured the time taken to transmit data about a construction defect, which included digital imagery and text, in order to assess the feasibility of transferring various types and sizes of data and the ICT-supported construction management applications that could be developed as a consequence. Data transfer rates over GPRS through the Telstra network and over WiFi on a private network were compared. Based on the data size and data transfer time, the rate of transfer was calculated to determine the actual speeds at which the information was being sent using the wireless mobile communication protocols. The report finds that transmission speeds vary considerably when using GPRS and can be significantly slower than what is advertised by mobile network providers. While WiFi is much faster than GPRS, its limited range restricts the protocol to residential-scale construction sites.
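The underlying calculation is straightforward: effective throughput is the transmitted data size divided by the transfer time. The snippet below illustrates it with hypothetical defect-report sizes and times, not the study's measurements.

```python
def transfer_rate_kbps(size_bytes: int, seconds: float) -> float:
    """Effective throughput in kilobits per second (size in bytes, time in seconds)."""
    return (size_bytes * 8) / seconds / 1000

# e.g. a 250 KB defect photo plus text sent over GPRS in 60 s versus WiFi in 2 s
report_size = 250 * 1024
print(f"GPRS: {transfer_rate_kbps(report_size, 60):.1f} kbit/s")
print(f"WiFi: {transfer_rate_kbps(report_size, 2):.1f} kbit/s")
```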
Abstract:
Maps have been published on the World Wide Web since its inception (Cartwright, 1999) and are still accessed and viewed by millions of users today (Peterson, 2003). While early web-based GIS products lacked a complete set of cartographic capabilities, the functionality within such systems has significantly increased over recent years. Functionalities once found only in desktop GIS products are now available in web-based GIS applications, for example data entry, basic editing, and analysis. Applications based on web GIS are becoming more widespread and the web-based GIS environment is replacing traditional desktop GIS platforms in many organizations. Therefore, the development of a new cartographic method for web-based GIS is vital. The broad aim of this project is to examine and discuss the challenges and opportunities of innovative cartographic methods for web-based GIS platforms. The work introduces a recently developed cartographic methodology based on a web-based GIS portal by the Survey of Israel (SOI). The work discusses the prospects and constraints of such methods in improving web-GIS interfaces and usability for the end user. The work also presents the preliminary findings of the initial implementation of the web-based GIS cartographic method within the portal of the Survey of Israel, as well as the applicability of those methods elsewhere.
Abstract:
Decision support systems (DSS) have evolved rapidly during the last decade from stand-alone or limited networked solutions to online participatory solutions. One of the major enablers of this change is the fastest-growing area of geographical information system (GIS) technology development, which relates to the use of the Internet as a means to access, display, and analyze geospatial data remotely. Worldwide, many federal, state, and particularly local governments are designing systems to facilitate data sharing using interactive Internet map servers. This new generation of DSS, or planning support systems (PSS), the interactive Internet map server, is the solution for delivering dynamic maps, GIS data and services via the World Wide Web, and for providing public participatory GIS (PPGIS) opportunities to a wider community (Carver, 2001; Jankowski & Nyerges, 2001). It provides a highly scalable framework for GIS Web publishing, Web-based public participatory GIS (WPPGIS), which meets the needs of corporate intranets and the demands of worldwide Internet access (Craig, 2002). The establishment of WPPGIS provides spatial data access through a support centre or a GIS portal to facilitate efficient access to, and sharing of, related geospatial data (Yigitcanlar, Baum, & Stimson, 2003). As more and more public and private entities adopt WPPGIS technology, the importance and complexity of facilitating geospatial data sharing is growing rapidly (Carver, 2003). Therefore, this article focuses on the online public participation dimension of GIS technology. The article provides an overview of recent literature on GIS and WPPGIS, and includes a discussion of the potential use of these technologies in providing a democratic platform for the public in decision-making.
Abstract:
The challenges of maintaining a building such as the Sydney Opera House are immense and depend upon a vast array of information. The value of information can be enhanced by its currency, accessibility and the ability to correlate data sets (integration of information sources). A building information model correlated to various information sources related to the facility is used as the definition of a digital facility model. Such a digital facility model would give transparent and integrated access to an array of datasets and would clearly support facility management processes. In order to construct such a digital facility model, two state-of-the-art information and communication technologies are considered: an internationally standardized building information model called the Industry Foundation Classes (IFC) and a variety of advanced communication and integration technologies often referred to as the Semantic Web, such as the Resource Description Framework (RDF) and the Web Ontology Language (OWL). This paper reports on some technical aspects of developing a digital facility model focusing on the Sydney Opera House. The proposed digital facility model enables IFC data to participate in an ontology-driven, service-oriented software environment. A proof-of-concept prototype has been developed demonstrating the usability of IFC information for collaboration with Sydney Opera House’s specific data sources using semantic web ontologies.
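As a hedged illustration of the kind of integration the paper describes, the fragment below exposes a small piece of IFC-style building data as RDF triples with rdflib so that it could be linked, through ontologies, to other facility data sources. The namespace URI, class and property names, and identifiers are assumptions made for the example, not the project's actual ontology or data.

```python
from rdflib import Graph, Literal, Namespace, RDF

FM = Namespace("http://example.org/facility#")   # hypothetical facility-management ontology

g = Graph()
space = FM["ConcertHall_Level1"]                                 # a space taken from the IFC model
g.add((space, RDF.type, FM.IfcSpace))                            # typed after its IFC entity
g.add((space, FM.globalId, Literal("2O2Fr$t4X7Zf8NOew3FLOH")))   # IFC GlobalId-style key
g.add((space, FM.hasMaintenanceRecord, FM["WO-2006-0421"]))      # link into an FM data source

print(g.serialize(format="turtle"))
```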
Abstract:
The management of main material prices for provincial highway project quotas suffers from problems of lag and blindness. First, a Web-based framework for a provincial highway project quota data MIS and a main material price data warehouse was established. Then, concrete processes for predicting provincial highway project main material prices were put forward based on the BP neural network algorithm. After that, the standard BP algorithm, a BP network algorithm with additional momentum, and a BP network algorithm with a self-adaptive learning rate were compared in predicting highway project main material prices. The results indicated that it is feasible to predict highway main material prices using a BP neural network, and that the BP network algorithm with a self-adaptive learning rate performs best.
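A minimal sketch of the comparison described above is given below: a back-propagation (BP) network trained with plain stochastic gradient descent, with additional momentum, and with an adaptive learning rate, fitted to synthetic price data. The features, data and network settings are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 3))                     # e.g. made-up indices for steel, cement, fuel
y = 50 * X[:, 0] + 30 * X[:, 1] + 20 * X[:, 2] + rng.normal(0, 1, 200)

variants = {
    "standard BP":        dict(momentum=0.0, learning_rate="constant"),
    "BP + momentum":      dict(momentum=0.9, learning_rate="constant"),
    "BP + adaptive rate": dict(momentum=0.9, learning_rate="adaptive"),
}
for name, opts in variants.items():
    model = MLPRegressor(hidden_layer_sizes=(8,), solver="sgd",
                         max_iter=2000, random_state=0, **opts)
    model.fit(X, y)
    print(f"{name:20s} R^2 = {model.score(X, y):.3f}")
```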
Abstract:
Cipher Cities was a practice-led research project developed in three stages between 2005 and 2007, resulting in the creation of a unique online community, ‘Cipher Cities’, that provides simple authoring tools and processes for individuals and groups to create their own mobile events and event journals, build community profiles and participate in other online community activities. Cipher Cities was created to revitalise people's relationship to everyday places by giving them the opportunity and motivation to create and share complex digital stories in simple and engaging ways. To do so, we developed new design processes and methods for both the research team and the end user to appropriate web and mobile technologies, and we collaborated with ethnographers, designers and ICT researchers and developers. In teams, we ran a series of workshops in a wide variety of cities in Australia to refine an engagement process and to test a series of iteratively developed prototypes, refining the systems that supported community motivation and collaboration. The result of the research is twofold: 1. a sophisticated prototype for researchers and designers to further experiment with community engagement methodologies using existing and emerging communications technologies; 2. a ‘human dimensions matrix’. This matrix assists in the identification and modification of place-based interventions in the social, technical, spatial, cultural and pedagogical conditions of any given community. The matrix has now become an essential part of a number of subsequent projects and assists design collaborators to successfully conceptualise, generate and evaluate interactive experiences. The research team employed practice-led action research methodologies that involved a collaborative effort across the fields of interaction design and social science, in particular ethnography, in order to: 1. seek, contest and refine a design methodology that would maximise the successful application of a dynamic system to create new kinds of interactions between people, places and artefacts; 2. design and deploy an application that intervenes in place-based and mobile technologies and offers people simple interfaces to create and share digital stories. Cipher Cities was awarded three separate CRC competitive grants (over $270,000 in total) to assist the three stages of research covering the development of the ethnographic design methodologies, the development of the tools, and the testing and refinement of both the engagement models and technologies. The resulting methodologies and tools are in the process of being commercialised by the Australasian CRC for Interaction Design.
Abstract:
Edith Penrose’s theory of firm growth postulates that a firm’s current growth rate will be influenced by the adjustment costs of, and changes to a firm’s productive opportunity set arising from, previous growth. Although she explicitly considered the impact of previous organic growth on current organic growth, she was largely silent about the impact of previous acquisitive growth. In this paper we extend Penrose’s work to examine the relative impact of organic and acquisitive growth on the adjustment costs and productive opportunity set of the firm. Employing a panel of commercially active enterprises in Sweden over a 10-year period, our results suggest the following. First, previous organic growth acts as a constraint on current organic growth. Second, previous acquisitive growth has a positive effect on current organic growth. We conclude that organic growth and acquisitive growth constitute two distinct strategic options facing the firm, which have a differential impact on the future organic growth of the firm.
Abstract:
The Internet theoretically enables marketers to personalize a Website to an individual consumer. This article examines optimal Website design from the perspective of personality trait theory and resource-matching theory. The influence of two traits relevant to Internet Web-site processing, sensation seeking and need for cognition, was studied in the context of resource matching and different levels of Web-site complexity. Data were collected at two points in time: personality-trait data first, followed by a laboratory experiment using constructed Web sites. Results reveal that (a) subjects prefer Web sites of a medium level of complexity, rather than high or low complexity; (b) high sensation seekers prefer complex visual designs, and low sensation seekers simple visual designs, both in Web sites of medium complexity; and (c) high need-for-cognition subjects evaluated Web sites with high verbal and low visual complexity more favourably.
Abstract:
The enhanced accessibility, affordability and capability of the Internet have created enormous possibilities in terms of designing, developing and implementing innovative teaching methods in the classroom. As existing pedagogies are revamped and new ones are added, there is a need to assess the effectiveness of these approaches from the students’ perspective. For more than three decades, proven qualitative and quantitative research methods associated with learning environments research have yielded productive results for educators. This article presents the findings of a study in which Getsmart, a teacher-designed website, was blended into science and physics lessons at an Australian high school. Students’ perceptions of this environment were investigated, together with differences in the perceptions of students in junior and senior years of schooling. The article also explores the impact of teachers in such an environment. The investigation undertaken in this study also gave an indication of how effective Getsmart was as a teaching model in such environments.
Abstract:
In the field of the semantic grid, QoS-based Web service scheduling for workflow optimization is an important problem. However, in a semantic- and service-rich environment like the semantic grid, the emergence of context constraints on Web services is very common, requiring scheduling to consider not only the quality properties of Web services but also the inter-service dependencies that are formed due to the context constraints imposed on Web services. In this paper, we present a repair genetic algorithm, namely a minimal-conflict hill-climbing repair genetic algorithm, to address scheduling optimization problems in workflow applications in the presence of domain constraints and inter-service dependencies. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
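The sketch below illustrates the general idea of such a repair genetic algorithm under simple assumptions: each workflow task is assigned one candidate service, assignments that violate inter-service dependency constraints are repaired by minimal-conflict hill climbing before evaluation, and fitness is the summed QoS utility. The QoS values, constraints and GA parameters are invented for illustration and do not reproduce the paper's method in detail.

```python
import random

N_TASKS, N_CANDIDATES = 5, 4
random.seed(1)
# qos[t][c]: utility of assigning candidate service c to workflow task t (higher is better).
qos = [[random.random() for _ in range(N_CANDIDATES)] for _ in range(N_TASKS)]
# Dependency constraints: if task_a uses cand_a, then task_b must not use forbidden_cand_b.
constraints = [(0, 1, 2, 3), (1, 0, 3, 2), (2, 2, 4, 0)]

def conflicts(sched):
    return sum(1 for ta, ca, tb, cb in constraints if sched[ta] == ca and sched[tb] == cb)

def fitness(sched):
    return sum(qos[t][c] for t, c in enumerate(sched))

def repair(sched):
    """Minimal-conflict hill climbing: reassign one task at a time to whichever
    candidate leaves the fewest remaining constraint conflicts."""
    while conflicts(sched) > 0:
        t = random.randrange(N_TASKS)
        best = min(range(N_CANDIDATES),
                   key=lambda c: conflicts(sched[:t] + [c] + sched[t + 1:]))
        sched[t] = best
    return sched

def evolve(pop_size=20, generations=40):
    pop = [repair([random.randrange(N_CANDIDATES) for _ in range(N_TASKS)])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # elitist truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_TASKS)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:             # mutation
                child[random.randrange(N_TASKS)] = random.randrange(N_CANDIDATES)
            children.append(repair(child))        # repair before the child enters the population
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best schedule:", best, "fitness:", round(fitness(best), 3))
```

Repairing infeasible offspring, rather than discarding or penalising them, keeps the whole population inside the feasible region, which is the main design choice the named algorithm relies on.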