897 results for end-to-end testing, javascript, web application, single-page application


Relevance: 100.00%

Abstract:

This paper attempts to develop a theoretical acceptance model for measuring Web personalization success. Key factors impacting Web personalization acceptance are identified from a detailed literature review. The final model is then cast in a structural equation modeling (SEM) framework comprising nineteen manifest variables, which are grouped into three focal behaviors of Web users. These variables could provide a framework for better understanding the numerous factors that contribute to the success measures of Web personalization technology, especially those concerning the quality of personalized features and how personalized information can be delivered to the user through a personalized Website. The interrelationships between the success constructs are also explained. Empirical validation of this theoretical model is expected in future research.

Relevance: 100.00%

Abstract:

Occlusion is a big challenge for facial expression recognition (FER) in real-world situations. Previous FER efforts to address occlusion suffer from loss of appearance features and are largely limited to a few occlusion types and a single testing strategy. This paper presents a robust approach for FER in occluded images that addresses these issues. A set of Gabor-based templates is extracted from images in the gallery using a Monte Carlo algorithm. These templates are converted into distance features using template matching, and the resulting feature vectors are robust to occlusion. Occluded eye and mouth regions and randomly placed occlusion patches are used for testing. Two testing strategies analyze the effects of these occlusions on overall recognition performance as well as on each facial expression. Experimental results on the Cohn-Kanade database confirm the high robustness of our approach and provide useful insights into the effects of occlusion on FER. Performance is also compared with previous approaches.
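
The template-matching step is what buys the occlusion robustness: each feature is the best matching distance of a small Gabor-filtered template anywhere in the probe, so an occlusion patch only degrades the templates that would have matched inside it. Below is a minimal sketch of that idea; the filter parameters, the coarse search stride, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(size=15, wavelength=4.0, theta=0.0, sigma=3.0):
    """Real part of a Gabor filter at orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * xr / wavelength)

def distance_features(probe, templates, kernel):
    """Convert a probe image into per-template minimum matching distances."""
    response = fftconvolve(probe, kernel, mode="same")  # Gabor response of probe
    feats = []
    for tmpl in templates:          # templates: small Gabor patches from the gallery
        th, tw = tmpl.shape
        best = np.inf
        # slide the template over the response (coarse stride keeps the sketch fast)
        for i in range(0, response.shape[0] - th, 4):
            for j in range(0, response.shape[1] - tw, 4):
                best = min(best, np.linalg.norm(response[i:i + th, j:j + tw] - tmpl))
        feats.append(best)          # occluded regions only affect their own templates
    return np.array(feats)
```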

Relevance: 100.00%

Abstract:

To obtain minimum-time or minimum-energy trajectories for robots, it is necessary to employ planning methods which adequately consider the platform's dynamic properties. A variety of sampling, graph-based and local receding-horizon optimisation methods have previously been proposed. These typically use simplified kinodynamic models to avoid the significant computational burden of solving this problem in a high-dimensional state space. In this paper we investigate solutions from the class of pseudospectral optimisation methods, which have grown in favour amongst the optimal control community in recent years and offer high computational efficiency and rapid convergence. We present a practical application of such an approach to the robot path planning problem, providing a trajectory that accounts for the robot's dynamic properties. We extend the existing literature by augmenting the path constraints with sensed obstacles rather than predefined analytical functions, enabling real-world application.
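
Pseudospectral methods collocate the trajectory at orthogonal-polynomial nodes, turning the differential dynamics into algebraic constraints and the whole planning problem into a nonlinear program. A minimal sketch of the core ingredient, assuming a Chebyshev-Gauss-Lobatto scheme (the abstract does not state which pseudospectral variant was used):

```python
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix D and nodes x on [-1, 1] (Trefethen's recipe)."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)         # Chebyshev-Gauss-Lobatto nodes
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))  # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                      # diagonal: negative row sums
    return D, x

# Sanity check: spectral differentiation of a smooth function converges very fast.
D, x = cheb(16)
err = np.max(np.abs(D @ np.sin(np.pi * x) - np.pi * np.cos(np.pi * x)))
print(f"max derivative error with 17 nodes: {err:.1e}")  # ~1e-9, spectral accuracy
```

In a full transcription, dynamics q' = f(q, u) over a horizon of length T become the algebraic constraints (2/T) D q = f(q, u) at every node, and sensed obstacles enter as inequality constraints on the node positions handed to the NLP solver.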

Relevance: 100.00%

Abstract:

Stigmergy is a biological term used when discussing insect or swarm behaviour, and describes a model of communication mediated by the environment rather than passed directly between agents. The phenomenon is exemplified by ants following pheromone trails while gathering food, and by termites building their mounds. What is interesting about this mechanism is that highly organized societies are achieved without any apparent management structure. Stigmergic behaviour is implicit in the Web, where the volume of users provides self-organization and self-contextualization of content on sites that facilitate collaboration. However, the majority of content is generated by a minority of the Web's participants. A significant contribution of this research would be a model of Web stigmergy that identifies virtual pheromones and their importance in the collaborative process. This paper explores how exploiting stigmergy has the potential to provide a valuable mechanism for identifying and analyzing online user behaviour, recording actionable knowledge otherwise lost in existing Web interaction dynamics. Ultimately this might assist in building better collaborative Web sites.
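
The "virtual pheromone" notion can be made concrete with the same two operations that govern ant trails: deposit on interaction, evaporate over time. The sketch below is an illustrative assumption of how such a model might be instrumented on a collaborative site; the class, its parameters (deposit size, evaporation rate rho), and the interpretation of interactions are not taken from the paper.

```python
from collections import defaultdict

class PheromoneMap:
    """Deposit-and-evaporate model of virtual pheromones over content items."""
    def __init__(self, rho=0.05, deposit=1.0):
        self.rho = rho              # evaporation rate per time step
        self.deposit = deposit      # pheromone added per user interaction
        self.levels = defaultdict(float)

    def interact(self, item):
        """A user action (view, edit, link) deposits pheromone on a content item."""
        self.levels[item] += self.deposit

    def step(self):
        """Evaporation: stale content loses influence, recent activity dominates."""
        for item in self.levels:
            self.levels[item] *= 1.0 - self.rho

    def ranking(self):
        """Content ranked by pheromone level approximates emergent collective interest."""
        return sorted(self.levels.items(), key=lambda kv: -kv[1])

# Usage: repeated interaction with one page makes it surface in the ranking
# without any explicit management structure, mirroring stigmergic self-organization.
pm = PheromoneMap()
for _ in range(10):
    pm.interact("page/A")
    pm.step()
pm.interact("page/B")
print(pm.ranking())
```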

Relevance: 100.00%

Abstract:

A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections due to human error, lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults can be found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying those software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done using machine learning algorithms which learn from examples of fault-prone and not fault-prone modules to develop predictive models of quality. In order to learn the numerical mapping between module and classification, a module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality; in this work, data is obtained from two sources, the NASA Metrics Data Program and the open source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which work best. Two machine learning algorithms are applied to the data, Naive Bayes and the Support Vector Machine, and the predictive results are compared to those of previous efforts, being found superior on selected data sets and comparable on others. In addition, a new classification method is proposed, Rank Sum, in which a ranking abstraction is laid over bin densities for each class and a classification is determined from the sum of ranks over features. A novel extension of this method is also described, based on an observed polarising of points by class when Rank Sum is applied to training data to convert it into a 2D rank-sum space. An SVM is applied to this transformed data to produce models whose parameters can be set according to trade-off curves to obtain a particular performance trade-off.
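
Reading the Rank Sum description literally, each feature contributes the rank of a class's bin density at the test point's bin, and the class with the lowest total rank wins. A minimal sketch under that reading follows; the binning scheme, tie handling, and function names are illustrative assumptions rather than the thesis's implementation.

```python
import numpy as np

def fit_rank_sum(X, y, n_bins=10):
    """Per feature and bin, precompute each class's density rank (0 = densest)."""
    classes = np.unique(y)
    edges, ranks = [], []
    for f in range(X.shape[1]):
        e = np.histogram_bin_edges(X[:, f], bins=n_bins)
        dens = np.stack([np.histogram(X[y == c, f], bins=e, density=True)[0]
                         for c in classes])             # (n_classes, n_bins)
        ranks.append(np.argsort(np.argsort(-dens, axis=0), axis=0))
        edges.append(e)
    return classes, edges, ranks

def predict_rank_sum(X, classes, edges, ranks):
    """Classify by the smallest sum of ranks over features."""
    preds = []
    for x in X:
        total = np.zeros(len(classes))
        for f, (e, r) in enumerate(zip(edges, ranks)):
            b = np.clip(np.searchsorted(e, x[f]) - 1, 0, r.shape[1] - 1)
            total += r[:, b]                            # rank of each class in this bin
        preds.append(classes[np.argmin(total)])
    return np.array(preds)
```

The described 2D extension would map each module to its pair of per-class rank sums and train an SVM in that transformed space.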

Relevance: 100.00%

Abstract:

Human hair fibres are ubiquitous in nature and are frequently found at crime scenes, often as a result of exchange between the perpetrator, victim and/or the surroundings according to Locard's Principle. Hair fibre evidence can therefore provide important information for crime investigation. For human hair evidence, current forensic methods of analysis rely on comparisons of either hair morphology by microscopic examination or nuclear and mitochondrial DNA analyses. Unfortunately, in some instances the use of microscopy and DNA analyses is difficult and often not feasible. This dissertation is arguably the first comprehensive investigation aimed at comparing, classifying and identifying single human scalp hair fibres with the aid of FTIR-ATR spectroscopy in a forensic context. Spectra were collected from the hair of 66 subjects of Asian, Caucasian and African-type origin. The fibres ranged from untreated to variously mildly and heavily cosmetically treated hairs. The collected spectra reflected the physical and chemical nature of a hair near the surface, particularly the cuticle layer. In total, 550 spectra were acquired and processed to construct a relatively large database. To assist with the interpretation of the complex spectra from various types of human hair, derivative spectroscopy and chemometric methods were utilised: Principal Component Analysis (PCA), Fuzzy Clustering (FC), and the Multi-Criteria Decision Making (MCDM) methods Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and Geometrical Analysis for Interactive Aid (GAIA). FTIR-ATR spectroscopy had two important advantages over previous methods: (i) sample throughput and spectral collection were significantly improved (no physical flattening or microscope manipulations), and (ii) given recent advances in FTIR-ATR instrument portability, there is real potential to transfer this work's findings seamlessly to in-field applications. The "raw" spectra, spectral subtractions and second-derivative spectra were compared to demonstrate the subtle differences in human hair. SEM images were used as corroborative evidence of the surface topography of hair, indicating that the condition of the cuticle surface falls into three types: untreated, mildly treated and chemically treated. Extensive study of the spectral band regions potentially responsible for matching and discriminating the various types of hair samples suggested that the 1690-1500 cm-1 IR spectral region was to be preferred over the commonly used 1750-800 cm-1 region. The principal reason was the presence of highly variable spectral profiles of cystine oxidation products (1200-1000 cm-1), which contributed significantly to spectral scatter and hence poor hair sample matching. In the preferred 1690-1500 cm-1 region, conformational changes in the keratin protein, attributed to α-helical to β-sheet transitions in the Amide I and Amide II vibrations, played a significant role in matching and discriminating the spectra and hence the hair fibre samples. For gender comparison, the Amide II band is significant for differentiation: male hair spectra exhibit a more intense β-sheet vibration in the Amide II band at approximately 1511 cm-1, whilst female hair spectra display a more intense α-helical vibration at 1520-1515 cm-1.
In terms of chemical composition, female hair spectra exhibit greater intensity from the amino acids tryptophan (1554 cm-1) and aspartic and glutamic acid (1577 cm-1). It was also observed that, for the separation of samples based on racial differences, untreated Caucasian hair was discriminated from Asian hair as a result of higher levels of the amino acids cystine and cysteic acid. However, when mildly or chemically treated, Asian and Caucasian hair fibres are similar, whereas African-type hair fibres remain distinct. In terms of the investigation's novel contribution to forensic science, it has allowed the development of a multifaceted, methodical protocol where previously none had existed. The protocol is a systematic method to rapidly investigate unknown or questioned single human hair FTIR-ATR spectra of different genders and racial origins, including fibres with different cosmetic treatments. Unknown or questioned spectra are first separated on the basis of chemical treatment (untreated, mildly treated or chemically treated), then gender, and then racial origin (Asian, Caucasian or African-type). The methodology has the potential to complement the current forensic methods of fibre-evidence analysis (microscopy and DNA), providing information at the morphological, genetic and structural levels.
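
The preprocessing and chemometric pipeline described above, restricting spectra to the preferred 1690-1500 cm-1 window, taking second derivatives to resolve the overlapping Amide I/II bands, and projecting with PCA, can be sketched as follows. The window width, polynomial order, scatter correction, and simulated data are illustrative assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

def preprocess(spectra, wavenumbers, lo=1500.0, hi=1690.0):
    """spectra: (n_samples, n_points); wavenumbers: (n_points,) in cm-1."""
    mask = (wavenumbers >= lo) & (wavenumbers <= hi)    # preferred Amide I/II region
    region = spectra[:, mask]
    # Savitzky-Golay second derivative sharpens overlapping bands, removes baselines
    d2 = savgol_filter(region, window_length=11, polyorder=3, deriv=2, axis=1)
    # row-wise standard normal variate correction for scatter between fibres
    return (d2 - d2.mean(axis=1, keepdims=True)) / d2.std(axis=1, keepdims=True)

# Example: explore cluster structure (treatment, gender, racial origin) in PC space.
rng = np.random.default_rng(0)
wn = np.linspace(1800, 800, 1000)                       # simulated wavenumber axis
spectra = rng.normal(size=(66, 1000)).cumsum(axis=1)    # stand-ins for 66 hair spectra
scores = PCA(n_components=3).fit_transform(preprocess(spectra, wn))
print(scores.shape)                                     # (66, 3)
```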

Relevance: 100.00%

Abstract:

Columns are among the key load-bearing elements most susceptible to vehicle impacts. The resulting severe damage to columns may lead to failures of the supporting structure that are catastrophic in nature. However, columns in existing structures are seldom designed for impact, owing to inadequacies in design guidelines. The impact behaviour of columns designed for gravity loads and actions other than impact is therefore of interest. A comprehensive investigation is conducted on reinforced concrete columns, with a particular focus on the vulnerability of exposed columns and on mitigation techniques under low- to medium-velocity car and truck impacts. The investigation is based on non-linear explicit computer simulations of impacted columns followed by a comprehensive validation process. The impact is simulated using force pulses generated from full-scale vehicle impact tests. A material model capable of simulating triaxial loading conditions is used in the analyses. Circular columns adequate in capacity for five- to twenty-storey buildings, designed according to Australian standards, are considered. The crucial parameters associated with routine column design and the different load combinations applied at the serviceability stage on typical columns are considered in detail. Axially loaded columns are examined at the initial stage, and the investigation is extended to the impact behaviour under single-axis bending and biaxial bending. The reduction of impact capacity under varying axial loads is also investigated. The effects of the various load combinations are quantified, and the residual capacity of the impacted columns, based on the state of damage, is presented together with mitigation techniques. In addition, the contribution of each individual parameter to the failure load is scrutinized, and analytical equations are developed to identify the critical impulses in terms of the geometrical and material properties of the impacted column. In particular, an innovative technique was developed to improve the accuracy of the equations where other techniques failed due to the shape of the error distribution. Above all, the equations can be used to quantify the critical impulse for three consecutive points (load combinations) located on the interaction diagram for a particular column; linear interpolation can then be used to quantify the critical impulse for loading points located in between on the interaction diagram. Given a known force and impulse pair for an average impact duration, this method can be extended to assess the vulnerability of columns for a general vehicle population, based on an analytical method that quantifies the critical peak forces under different impact durations. The contribution of this research is therefore not limited to simplified yet rational design guidelines and equations; it also provides a comprehensive solution for quantifying impact capacity while delivering new insight to the scientific community for dealing with impacts.
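
The interpolation step at the end can be made concrete: with critical impulses known at three consecutive load combinations on a column's axial-load/moment interaction diagram, an intermediate loading point is handled by linear interpolation. The numbers below are purely illustrative assumptions, not values from the thesis.

```python
import numpy as np

# axial load ratio N/Nu at three anchor points on the interaction diagram
axial_ratio = np.array([0.2, 0.4, 0.6])
# critical impulse (kN.s) given by the analytical equations at those points
critical_impulse = np.array([18.0, 23.5, 27.0])

def impulse_at(n_ratio):
    """Critical impulse for an in-between loading point (linear interpolation)."""
    return np.interp(n_ratio, axial_ratio, critical_impulse)

print(impulse_at(0.5))  # 25.25 kN.s, halfway between the 0.4 and 0.6 anchors
```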

Relevance: 100.00%

Abstract:

In this paper we advocate the continued need for consumer protection and fair trading regulation, even in competitive markets. For the purposes of this paper a ‘competitive market’ is defined as one that has low barriers to entry and exit, homogeneous products and services, and numerous suppliers. Whilst competition is an important tool for providing consumer benefits, it will not be sufficient to protect at least some consumers, particularly vulnerable, low-income consumers. For this reason, we argue, setting competition as the ‘end goal’ and assuming that consumer protection and consumer benefits will always follow is a flawed regulatory approach. The ‘end goal’ should surely be consumer protection and fair markets, and a combination of competition law and consumer protection law should be applied to achieve those goals.

Relevance: 100.00%

Abstract:

Analytical expressions are derived for the mean and variance of estimates of the bispectrum of a real time series, assuming a cosinusoidal model. The effects of spectral leakage, inherent in the discrete Fourier transform operation when the modes present in the signal have a nonintegral number of wavelengths in the record, are included in the analysis. A single phase-coupled triad of modes can cause the bispectrum to have a nonzero mean value over the entire region of computation owing to leakage. The variance of bispectral estimates in the presence of leakage has contributions from individual modes and from triads of phase-coupled modes. Time-domain windowing reduces the leakage. The theoretical expressions for the mean and variance of bispectral estimates are derived in terms of a function dependent on an arbitrary symmetric time-domain window applied to the record, the number of data, and the statistics of the phase coupling among triads of modes. The theoretical results are verified by numerical simulations for simple test cases and applied to laboratory data to examine phase coupling in a hypothesis-testing framework.
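
For concreteness, a direct (segment-averaged) bispectrum estimator with time-domain windowing can be written in a few lines; the window, segment length, and test frequencies below are illustrative assumptions. Placing the coupled triad on integral bins keeps leakage minimal; moving k1 or k2 off-bin reproduces the leakage effects the analysis quantifies.

```python
import numpy as np

def bispectrum(x, nseg, nfft):
    """Average B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2)) over windowed segments."""
    w = np.hanning(nfft)                        # symmetric time-domain window
    X = np.fft.rfft(x[:nseg * nfft].reshape(nseg, nfft) * w, axis=1)
    n = nfft // 4                               # keep f1 + f2 well below Nyquist
    idx = np.arange(n)
    B = np.zeros((n, n), dtype=complex)
    for Xk in X:                                # accumulate the triple product
        B += Xk[idx, None] * Xk[None, idx] * np.conj(Xk[idx[:, None] + idx[None, :]])
    return B / nseg

# A phase-coupled triad: the mode at bin k1 + k2 carries the sum of the other phases.
rng = np.random.default_rng(1)
nfft, k1, k2 = 256, 13, 30                      # integral bins -> minimal leakage
t = np.arange(16 * nfft)
p1, p2 = rng.uniform(0.0, 2.0 * np.pi, size=2)
x = (np.cos(2 * np.pi * k1 * t / nfft + p1) + np.cos(2 * np.pi * k2 * t / nfft + p2)
     + np.cos(2 * np.pi * (k1 + k2) * t / nfft + p1 + p2)
     + 0.1 * rng.normal(size=t.size))
B = bispectrum(x, nseg=16, nfft=nfft)
print(abs(B[k1, k2]), abs(B[5, 7]))             # coupled-triad peak vs background bin
```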

Relevance: 100.00%

Abstract:

An experimental laboratory investigation was carried out to assess the structural adequacy of a disused PHO Class Flat Bottom Rail Wagon (FRW) for a single-lane low-volume road bridge application as per the design provisions of the Australian Bridge Design Standard AS 5100 (2004). The investigation also encompassed a review of the risk associated with pre-existing damage incurred by wagons during their service life on rail. The main objective of the laboratory testing of the FRW was to physically measure its performance under the same applied traffic loading it would be required to resist as a road bridge deck. Achieving this would require a full-width (5.2 m), single-lane, single-span (approximately 10 m), simply supported bridge to be constructed and tested in a structural laboratory. However, the available clear spacing between the columns of the laboratory's loading portal frame was insufficient to accommodate the 5.2 m wide bridge deck plus the clearance normally considered necessary in structural testing. Therefore, only half of the full-scale bridge deck (a single FRW of width 2.6 m) could be accommodated and tested, with the continuity of the bridge deck in the lateral direction applied as boundary constraints along the full length of the FRW at six selected locations. To the best of the author's knowledge, this represents a novel approach to bridge deck testing not yet reported in the literature. The test was carried out under two loadings provided in AS 5100 (2004): a stationary W80 wheel load and a moving M1600 axle load. As the bridge investigated in the study is a single-lane, single-span, low-volume road bridge, the risk posed by pre-existing damage and the potential for high-cycle fatigue failure were assessed as minimal, and hence the bridge deck was not tested structurally for fatigue/fracture. The investigation of the high axle load requirements instead focused on the serviceability and ultimate limit state requirements. The testing regime nevertheless involved extensive recording of strains and deflections at several critical locations on the FRW. Three locations of the W80 point load and two locations of the M1600 axle load were considered for the serviceability testing; the FRW was also tested under the ultimate load dictated by the M1600. The outcomes of the experimental investigation demonstrate that the FRW is structurally adequate to resist the prescribed traffic loadings set out in AS 5100 (2004). As the loading was applied directly onto the FRW, the laboratory testing is assessed as significantly conservative: in the field, the FRW bridge deck would resist only the load transferred by the running platform, where, depending on the design, composite action might exist, so the share of the loading to be resisted by the FRW would be smaller than in the system tested in the laboratory. On this basis, a demonstration bridge is under construction at the time of writing this thesis, and future research will involve field testing to assess its performance.

Relevance: 100.00%

Abstract:

In recent decades, assessment practices within Australian law schools have moved from the overwhelming use of end-of-year closed-book examinations to an increase in the use of a wider range of techniques. This shift is often characterised as providing a ‘better’ learning environment for students, contributing more positively to their own ‘personal development’ within higher education, or, considered along the lines of critical legal thought, as ‘liberating’ them from the ‘conservatising’ and ‘indoctrinating’ effects of the power relations that operate in law schools. This paper seeks to render problematic such liberal-progressive narratives about these changes to law school assessment practices. It will do so by utilising the work of French historian and philosopher Michel Foucault on power, arguing that the current range of assessment techniques demonstrates a shift in the ‘economy’ of power relations within the law school. Rather than ‘liberating’ students from relations of power, these practices actually extend the power relations through which students are governed. This analysis is intended to inform legal education research and assessment practice by providing a far more nuanced conceptual framework than one that seeks to ‘free’ law students from these ‘repressive’ practices, or hopes to ‘objectively’ contribute to their ‘personal development’.

Relevance: 100.00%

Abstract:

Recently, the application of the quasi-steady-state approximation (QSSA) to the stochastic simulation algorithm (SSA) was suggested as a way of speeding up stochastic simulations of chemical systems that involve both relatively fast and slow chemical reactions [Rao and Arkin, J. Chem. Phys. 118, 4999 (2003)], and further work has led to the nested and slow-scale SSAs. Improved numerical efficiency is obtained by respecting the vastly different time scales characterizing the system and then advancing only the slow reactions exactly, based on a suitable approximation to the fast reactions. We considerably extend these works by applying the QSSA to numerical methods for the direct solution of the chemical master equation (CME) and, in particular, to the finite state projection algorithm [Munsky and Khammash, J. Chem. Phys. 124, 044104 (2006)], in conjunction with Krylov methods. In addition, we point out some important connections to the literature on the (deterministic) total QSSA (tQSSA) and place the stochastic analogue of the QSSA within the more general framework of aggregation of Markov processes. We demonstrate the new methods on four examples: Michaelis–Menten enzyme kinetics, double phosphorylation, the Goldbeter–Koshland switch, and the mitogen-activated protein kinase cascade. Overall, we report dramatic improvements from applying the tQSSA to the CME solver.
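
For reference, the exact SSA that the quasi-steady-state and slow-scale methods accelerate can be sketched for the Michaelis-Menten example: E + S <-> ES -> E + P. The plain algorithm below simulates every reaction; the slow-scale variants would instead treat the fast binding/unbinding pair as being in partial equilibrium and advance only the slow catalysis step exactly. Rate constants and copy numbers are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gillespie_mm(E=100, S=300, k1=0.001, k2=0.01, k3=0.1, t_end=200.0, seed=0):
    """Exact SSA for E + S <-> ES -> E + P; returns event times and state trajectory."""
    rng = np.random.default_rng(seed)
    t, state = 0.0, np.array([E, S, 0, 0], dtype=float)   # [E, S, ES, P]
    nu = np.array([[-1, -1, +1, 0],    # binding:    E + S -> ES   (fast)
                   [+1, +1, -1, 0],    # unbinding:  ES -> E + S   (fast)
                   [+1, 0, -1, +1]])   # catalysis:  ES -> E + P   (slow)
    times, traj = [t], [state.copy()]
    while t < t_end:
        E_, S_, ES_, _ = state
        a = np.array([k1 * E_ * S_, k2 * ES_, k3 * ES_])  # reaction propensities
        a0 = a.sum()
        if a0 == 0:
            break                                          # substrate exhausted
        t += rng.exponential(1.0 / a0)                     # time to next reaction
        state += nu[rng.choice(3, p=a / a0)]               # which reaction fires
        times.append(t); traj.append(state.copy())
    return np.array(times), np.array(traj)

times, traj = gillespie_mm()
print(f"product molecules at t = {times[-1]:.1f}: {traj[-1, 3]:.0f}")
```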

Relevance: 100.00%

Abstract:

Part-time work presents a conundrum. Across industrialised countries, there has been significant growth in part-time work as a way of resolving the diverse interests of employers, workers and families in managing time and resources. However, there are intrinsic disadvantages associated with part-time work, notably in pay and career prospects, which impact the same stakeholders it is intended to benefit. These disadvantages are particularly evident in professional services organisations, due to strong cultural norms of long work hours, single-minded commitment to work and 24x7 availability. There are indications, both in research and practice, that the design of part-time work arrangements could be improved to address some of these disadvantages, and to challenge the norms and dated assumptions that shape the structure of professional work. This study explored the changes made when professional service workers move from a full-time to a part-time arrangement. The study drew on a recently proposed framework for work design, which extended previous models to reflect substantial changes in the contemporary work environment. The framework proved a useful perspective from which to explore the design of part-time work, principally because it integrated previously disconnected areas of literature and practice through a broader focus on the context of work. Exploring the differences between part-time and full-time roles, and comparing part-time roles in similar types of work, provided insights into the design of professional part-time work. The analysis revealed that a better understanding of design characteristics may help explain the disadvantages associated with professional part-time work, and that some full-time roles can be more easily adapted to part-time arrangements than others. Importantly, the comparisons revealed that even roles considered difficult to undertake on a part-time basis can potentially be redesigned to be more effective. Through empirical testing of the framework, a contextualised work design model is presented that may guide further research and the practice of crafting part-time arrangements. The findings also suggest that poor work design may produce the symptoms associated with professional part-time work, and that improved work design may be a potential solution to these structural constraints.

Relevance: 100.00%

Abstract:

Intuitively, any ‘bag of words’ approach in IR should benefit from taking term dependencies into account. Unfortunately, for years the results of exploiting such dependencies have been mixed or inconclusive. To improve the situation, this paper shows how the natural language properties of the target documents can be used to transform and enrich the term dependencies into more useful statistics. This is done in three steps. First, the term co-occurrence statistics of queries and documents are each represented by a Markov chain. The paper proves that such a chain is ergodic, and therefore its asymptotic behavior is unique, stationary, and independent of the initial state. Next, the stationary distribution is taken to model queries and documents, rather than their initial distributions. Finally, ranking is achieved following the customary language modeling paradigm. The main contribution of this paper is to argue why the asymptotic behavior of the document model is a better representation than just the document's initial distribution. A secondary contribution is to investigate the practical application of this representation as queries become increasingly verbose. In the experiments (based on Lemur's search engine substrate) the default query model was replaced by the stable distribution of the query. Modeling the query this way already resulted in significant improvements over a standard language model baseline; the results were on a par with or better than more sophisticated algorithms that use fine-tuned parameters or extensive training. Moreover, the more verbose the query, the more effective the approach seems to become.
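
A minimal sketch of the representation, assuming adjacency co-occurrence and a small smoothing term (the smoothing keeps the chain irreducible and aperiodic, hence ergodic with a unique stationary distribution, matching the paper's premise). The construction details are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def stationary_term_model(tokens, alpha=0.01):
    """Stationary distribution of a term co-occurrence Markov chain."""
    vocab = sorted(set(tokens))
    pos = {w: i for i, w in enumerate(vocab)}
    n = len(vocab)
    C = np.full((n, n), alpha)                  # smoothing keeps the chain ergodic
    for a, b in zip(tokens, tokens[1:]):        # adjacent-term co-occurrence counts
        C[pos[a], pos[b]] += 1.0
    P = C / C.sum(axis=1, keepdims=True)        # row-stochastic transition matrix
    pi = np.full(n, 1.0 / n)
    for _ in range(500):                        # power iteration to the fixed point
        nxt = pi @ P
        if np.abs(nxt - pi).sum() < 1e-12:
            break
        pi = nxt
    return dict(zip(vocab, pi))                 # replaces the raw term distribution

model = stationary_term_model("to be or not to be that is the question".split())
print(max(model, key=model.get))                # most central term under the chain
```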

Relevance: 100.00%

Abstract:

This study examined the effect that venture creation action has on the outcomes of nascent entrepreneurship. A theoretical model was developed which proposes action as a fundamental mechanism in venture creation; action should thus rightly be considered a means rather than an end in itself. In this respect, action transmits the effects of venture resource endowments onto venture creation outcomes. This conceptual model was empirically supported in a random sample of nascent ventures. Ventures with higher levels of human or social capital tend to be more active in venture creation, and more active venture attempts are, in turn, more likely to achieve improved results.