399 results for Point Stimulation Method


Relevance: 20.00%

Abstract:

Web service technology is increasingly used to build e-Applications in domains such as e-Business and e-Science. Its characteristic benefits are interoperability, decoupling and just-in-time integration. Using web service technology, an e-Application can be implemented by web service composition: existing individual web services are composed in accordance with the business process of the application, so that the application is delivered to customers as a value-added composite web service. An important and challenging issue in web service composition is how to meet Quality-of-Service (QoS) requirements, which cover customer-focused attributes such as response time, price, throughput and reliability. Meeting these requirements fulfils customers' expectations and achieves their satisfaction. Addressing this QoS-aware web service composition problem is the focus of this project.

From a computational point of view, QoS-aware web service composition can be transformed into various optimisation problems, which are typically complex, large-scale, highly constrained and multi-objective. We therefore use genetic algorithms (GAs) to address QoS-based service composition problems. More precisely, this study addresses three important subproblems of QoS-aware web service composition: QoS-based web service selection for a composite web service, accommodating constraints on inter-service dependence and conflict; QoS-based resource allocation and scheduling for multiple composite services on hybrid clouds; and performance-driven composite service partitioning for decentralised execution. Based on operations research theory, we model the three problems as a constrained optimisation problem, a resource allocation and scheduling problem, and a graph partitioning problem, respectively. We then present novel GAs to address these problems, conduct experiments to evaluate their performance, and perform verification experiments to show their correctness.

The major outcomes from the first problem are three novel GAs: a penalty-based GA, a min-conflict hill-climbing repairing GA, and a hybrid GA. These GAs adopt different strategies to handle constraints on inter-service dependence and conflict, a factor largely ignored by existing algorithms, whose neglect can lead to the generation of infeasible composite services. Experimental results demonstrate the effectiveness of our GAs for the QoS-based web service selection problem with constraints on inter-service dependence and conflict, as well as their better scalability than the existing integer programming-based method on large-scale web service selection problems. The second problem yielded two GAs: a random-key GA and a cooperative coevolutionary GA (CCGA). Experiments demonstrate the good scalability of both algorithms; in particular, the CCGA scales well as the number of composite services in a problem increases, an ability no other algorithm demonstrates. The third problem yielded a novel GA for composite service partitioning for decentralised execution. Compared with existing heuristic algorithms, the new GA is better suited to large-scale composite web service partitioning problems and generates a better deployment topology for a composite web service executed in a decentralised fashion. These effective and scalable GAs can be integrated into QoS-based management tools to facilitate the delivery of feasible, reliable and high-quality composite web services.
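As an illustration of the first subproblem, the sketch below shows how a penalty-based GA of the general kind described here can select one candidate service per task while penalising violations of inter-service dependence and conflict constraints. It is a minimal sketch, not the thesis's algorithm: the toy QoS data, aggregation weights, penalty coefficient and GA settings are all assumptions.

```python
# Minimal penalty-based GA sketch for QoS-aware service selection (illustrative).
import random

random.seed(1)

N_TASKS, N_CANDIDATES = 6, 5
# QoS per candidate: (response_time, price, reliability); first two lower-is-better.
qos = [[(random.uniform(1, 10), random.uniform(1, 5), random.uniform(0.9, 1.0))
        for _ in range(N_CANDIDATES)] for _ in range(N_TASKS)]

# Dependence: choosing (task 0, candidate 1) requires (task 2, candidate 3).
dependencies = [((0, 1), (2, 3))]
# Conflict: (task 1, candidate 0) and (task 3, candidate 0) must not co-occur.
conflicts = [((1, 0), (3, 0))]

def penalty(ch):
    v = 0
    for (a, i), (b, j) in dependencies:
        if ch[a] == i and ch[b] != j:
            v += 1
    for (a, i), (b, j) in conflicts:
        if ch[a] == i and ch[b] == j:
            v += 1
    return v

def fitness(ch):
    rt = sum(qos[t][c][0] for t, c in enumerate(ch))      # additive attribute
    price = sum(qos[t][c][1] for t, c in enumerate(ch))   # additive attribute
    rel = 1.0
    for t, c in enumerate(ch):
        rel *= qos[t][c][2]                               # multiplicative attribute
    score = -0.4 * rt - 0.4 * price + 0.2 * 100 * rel     # weighted aggregate
    return score - 50 * penalty(ch)                       # static penalty term

def evolve(pop_size=40, gens=60, pm=0.1):
    pop = [[random.randrange(N_CANDIDATES) for _ in range(N_TASKS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = (max(random.sample(pop, 3), key=fitness) for _ in range(2))
            cut = random.randrange(1, N_TASKS)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < pm:                      # point mutation
                child[random.randrange(N_TASKS)] = random.randrange(N_CANDIDATES)
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print("best selection:", best, "constraint violations:", penalty(best))
```

The static penalty keeps infeasible selections in the population but ranks them below feasible ones; the thesis's min-conflict repairing and hybrid GAs handle the same constraints by different strategies.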

Relevance: 20.00%

Abstract:

This is a study of the academic numeracy of nursing students, which develops a theoretical model for the design and delivery of university courses in academic numeracy. The following objectives are addressed: 1. to investigate nursing students' current knowledge of academic numeracy; 2. to investigate how nursing students' knowledge and skills in academic numeracy can be enhanced using a developmental psychology framework; and 3. to use the data derived from meeting objectives 1 and 2 to develop a theoretical model for embedding academic numeracy in university programs. The study draws on Valsiner's Human Development Theory (Valsiner, 1997, 2007). It is a quasi-experimental intervention case study (Faltis, 1997) that takes a multi-method approach, using pre- and post-tests, observation notes and semi-structured teaching sessions to document a series of microgenetic studies of student numeracy. Each microgenetic study is centred on the lived experience of students becoming more numerate, with the method based on Vygotsky's double stimulation (Valsiner, 2000a, 2007). Data collection includes interviews on students' past experience with mathematics, their present feelings and experiences, and how these present feelings and experiences are transformed. The findings provide evidence that the course developed for nursing students, underpinned by an appropriate framework, does improve academic numeracy. More specifically, students improved their content knowledge of, and confidence in, mathematics in areas directly related to their degree. The study used Valsiner's microgenetic approach to development to trace the course as it was taught, together with two students' personal academic numeracy journeys. It highlighted particularly troublesome concepts, then outlined the scaffolding and pathways used to develop understanding. This approach to academic numeracy development was summarised as a four-faceted model at the university, program, course and individual levels, which can be applied successfully to similar contexts. The thesis thus advances both theory and practice in this under-researched and under-theorised area.

Relevance: 20.00%

Abstract:

In this paper, a new practical method based on graph theory and an improved genetic algorithm is employed to solve the optimal sectionalizer switch placement problem. The proposed method determines the best locations for sectionalizer switching devices in distribution networks, accounting for the effects of the presence of distributed generation (DG) in the fitness function and the other optimization constraints, so as to maximise the number of customers that can be supplied by distributed generation sources in islanded distribution systems after possible faults. The proposed method is simulated and tested on several distribution test systems, both with and without DG. The simulation results validate the proposed method for switch placement in distribution networks in the presence of distributed generation.
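To make the islanding idea concrete, here is a toy sketch (not the paper's method) of a fitness function for switch placement on a radial feeder: for each possible fault, customers upstream keep their utility supply, and a downstream island survives only if a sectionalizer isolates the fault and the island's DG capacity covers its load. All feeder data and the switch-cost weighting are illustrative assumptions, and the exhaustive search at the end stands in for the paper's improved GA.

```python
# Toy islanding-aware fitness for sectionalizer placement on a chain feeder.
import itertools

N = 8                                   # buses 0..7; line i joins buses i-1 and i
customers = [0, 40, 25, 30, 20, 35, 15, 45]             # customers at each bus
load      = [0.0, 0.4, 0.3, 0.3, 0.2, 0.4, 0.1, 0.5]    # MW demand at each bus
dg        = [0.0, 0.0, 0.0, 0.9, 0.0, 0.0, 0.8, 0.0]    # MW of DG at each bus
SWITCH_COST = 5                         # customer-equivalent cost per switch

def supplied_after_fault(switches, fault_line):
    """Customers still supplied after a fault on fault_line (1..N-1)."""
    served = sum(customers[:fault_line])        # upstream keeps utility supply
    # The first sectionalizer strictly downstream of the fault isolates it.
    downstream = [s for s in switches if s > fault_line]
    if downstream:
        s = min(downstream)
        island = range(s, N)                    # buses s..N-1 form an island
        if sum(dg[b] for b in island) >= sum(load[b] for b in island):
            served += sum(customers[b] for b in island)
    return served

def fitness(switches):
    total = sum(supplied_after_fault(switches, f) for f in range(1, N))
    return total - SWITCH_COST * len(switches)

# Small enough to enumerate here; the paper's GA searches this space instead.
best = max((frozenset(c) for r in range(N)
            for c in itertools.combinations(range(1, N), r)), key=fitness)
print("best switch locations (lines):", sorted(best), "fitness:", fitness(best))
```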

Relevance: 20.00%

Abstract:

Recent developments in sustainable engineering and renewable energy are often governed by fractional partial differential equations (FPDEs), so numerical modelling and simulation based on fractional calculus are attracting growing attention from researchers. The currently dominant numerical method for modelling FPDEs is the finite difference method (FDM), which is based on a pre-defined grid and therefore has inherent shortcomings, including difficulty in simulating problems with complex domains and in using irregularly distributed nodes. Because of its distinct advantages, the meshless method has good potential for the simulation of FPDEs. This paper develops an implicit meshless collocation technique for FPDEs: the discrete system is obtained using meshless shape functions and the meshless collocation formulation. The stability and convergence of this meshless approach are investigated theoretically and numerically, and numerical examples with regular and irregular nodal distributions are used to validate the newly developed formulation and to investigate its accuracy and efficiency. It is concluded that the present meshless formulation is very effective for the modelling and simulation of fractional partial differential equations.
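A minimal sketch of one common way such an implicit meshless collocation scheme can be assembled, assuming a Kansa-type RBF collocation in one dimension and the standard L1 discretisation of the Caputo time-fractional derivative; the node layout, Gaussian RBF, shape parameter and test problem are illustrative assumptions rather than the paper's exact formulation.

```python
# Implicit RBF (Kansa) collocation sketch for D_t^alpha u = kappa * u_xx
# (Caputo derivative, 0 < alpha < 1) with the L1 time-stepping scheme.
import numpy as np
from math import gamma

alpha, kappa, eps = 0.8, 1.0, 8.0        # fractional order, diffusivity, RBF shape
x = np.linspace(0.0, 1.0, 21)            # collocation nodes (could be irregular)
n, dt, steps = len(x), 1e-3, 200
mu = dt**(-alpha) / gamma(2.0 - alpha)   # L1 scheme prefactor

r2 = (x[:, None] - x[None, :])**2
A = np.exp(-eps**2 * r2)                               # phi(|x_i - x_k|)
L = (4*eps**4*r2 - 2*eps**2) * np.exp(-eps**2 * r2)    # phi'' (Gaussian RBF)

M = mu * A - kappa * L                   # PDE rows (interior collocation)
M[0, :], M[-1, :] = A[0, :], A[-1, :]    # Dirichlet rows at the two boundaries

b = np.array([(j+1)**(1-alpha) - j**(1-alpha) for j in range(steps)])  # L1 weights
hist = [np.sin(np.pi * x)]               # u^0: initial condition

for step in range(1, steps + 1):
    # History term:  sum_{j>=1} b_j (u^{n-j} - u^{n-j-1})
    h = np.zeros(n)
    for j in range(1, step):
        h += b[j] * (hist[step-j] - hist[step-j-1])
    rhs = mu * (hist[-1] - h)            # interior right-hand side
    rhs[0] = rhs[-1] = 0.0               # boundary values u(0) = u(1) = 0
    lam = np.linalg.solve(M, rhs)        # RBF coefficients at this time level
    hist.append(A @ lam)                 # recover nodal values u^n

print("u(x=0.5, t=0.2) ≈", hist[-1][n // 2])
```

Because the shape functions are tied to scattered nodes rather than a grid, the same assembly works unchanged for an irregular nodal distribution, which is the advantage over FDM highlighted above.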

Relevance: 20.00%

Abstract:

We read the excellent review of telemonitoring in chronic heart failure (CHF) [1] with interest and commend the authors on the proposed classification of telemedical remote management systems according to the type of data transfer, decision ability and level of integration. However, several points require clarification in relation to our Cochrane review of telemonitoring and structured telephone support [2]. We included a study by Kielblock [3]. We corresponded directly with this study team specifically to find out whether or not this was a randomised study and were informed that it was a randomised trial, albeit by date of birth. We note in our review [2] that this randomisation method carries a high risk of bias. Post-hoc meta-analyses without these data demonstrate no substantial change to the effect estimates for all-cause mortality (original risk ratio (RR) 0.66 [95% CI 0.54, 0.81], p<0.0001; revised RR 0.72 [95% CI 0.57, 0.92], p=0.008), all-cause hospitalisation (original RR 0.91 [95% CI 0.84, 0.99], p=0.02; revised RR 0.92 [95% CI 0.84, 1.02], p=0.10) or CHF-related hospitalisation (original RR 0.79 [95% CI 0.67, 0.94], p=0.008; revised RR 0.75 [95% CI 0.60, 0.94], p=0.01). Secondly, we would classify the Tele-HF study [4, 5] as structured telephone support rather than telemonitoring. Again, inclusion of these data alters the point estimate but not the overall result of the meta-analyses [4]. Finally, our review [2] does not include invasive telemonitoring, as the search strategy was not designed to capture such studies; direct comparison of our review's findings with recent studies of these interventions is therefore not recommended.
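For readers unfamiliar with the mechanics of such a sensitivity re-analysis, the sketch below shows fixed-effect inverse-variance pooling of risk ratios with one trial removed. The trial numbers are invented for illustration; they are not the review's data.

```python
# Fixed-effect inverse-variance pooling of risk ratios (illustrative only).
import math

def pooled_rr(studies):
    """Pooled risk ratio and 95% CI from per-study (rr, ci_low, ci_high)."""
    num = den = 0.0
    for rr, lo, hi in studies:
        log_rr = math.log(rr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # back out SE from CI
        w = 1.0 / se**2                                  # inverse-variance weight
        num += w * log_rr
        den += w
    half = 1.96 / math.sqrt(den)
    return (math.exp(num / den),
            math.exp(num / den - half), math.exp(num / den + half))

trials = [(0.55, 0.35, 0.86), (0.80, 0.60, 1.07), (0.70, 0.50, 0.98)]  # hypothetical
print("all trials:      RR %.2f (%.2f, %.2f)" % pooled_rr(trials))
print("trial 1 removed: RR %.2f (%.2f, %.2f)" % pooled_rr(trials[1:]))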

Relevance: 20.00%

Abstract:

Modelling an environmental process involves creating a model structure and parameterising the model with values that accurately represent the process. Determining accurate parameter values for environmental systems can be challenging. Existing methods for parameter estimation typically make assumptions about the form of the likelihood and often ignore the uncertainty around estimated values. This is problematic, particularly in complex problems where likelihoods may be intractable. In this paper we demonstrate an Approximate Bayesian Computation (ABC) method for estimating the parameters of a stochastic cellular automaton (CA). As an example we use a CA constructed to simulate a range expansion, such as might occur after a biological invasion, making parameter estimates using only count data of the kind that could be gathered from field observations. We demonstrate that ABC is a highly useful method for parameter estimation, giving accurate estimates of parameters that are important for the management of invasive species, such as the intrinsic rate of increase and the point in a landscape where a species has invaded. We also show that the method is capable of estimating the probability of long-distance dispersal, a characteristic of biological invasions that strongly influences spread rates but has until now proved difficult to estimate accurately.
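A minimal sketch of the approach, assuming a toy one-dimensional invasion CA and a simple rejection-quantile ABC sampler; the CA rules, priors, summary statistic and acceptance fraction are illustrative assumptions rather than the paper's model.

```python
# ABC sketch: estimate spread and long-distance dispersal probabilities of a
# toy stochastic invasion CA from occupied-cell counts alone.
import random

random.seed(7)
SIZE, STEPS = 100, 20

def simulate(p_spread, p_jump):
    """Toy invasion CA; returns the occupied-cell count after each step."""
    occ = [False] * SIZE
    occ[SIZE // 2] = True                        # single introduction point
    counts = []
    for _ in range(STEPS):
        new = occ[:]
        for i, o in enumerate(occ):
            if not o:
                continue
            for j in (i - 1, i + 1):             # local spread to neighbours
                if 0 <= j < SIZE and random.random() < p_spread:
                    new[j] = True
            if random.random() < p_jump:         # rare long-distance dispersal
                new[random.randrange(SIZE)] = True
        occ = new
        counts.append(sum(occ))
    return counts

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

observed = simulate(0.3, 0.02)                   # stand-in "field" count data

draws = []
for _ in range(1000):                            # ABC: sample from the priors,
    p, q = random.uniform(0, 1), random.uniform(0, 0.1)
    draws.append((distance(simulate(p, q), observed), p, q))
draws.sort()
best = draws[:50]                                # keep the closest 5% of draws
print("posterior means: spread ≈ %.2f, jump ≈ %.3f"
      % (sum(p for _, p, _ in best) / 50, sum(q for _, _, q in best) / 50))
```

Only the count summaries enter the distance, mirroring the point that ABC needs simulated and observed data to be compared, not a tractable likelihood.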

Relevance: 20.00%

Abstract:

Thin solid films are extensively used in the making of solar cells, cutting tools, magnetic recording devices and similar products, so accurate measurement of their mechanical properties, such as hardness and elastic modulus, is required. The thickness of thin films typically ranges from tens of nanometers to several micrometers, which makes measuring their mechanical properties challenging. In this study, a nanoscratch method was proposed for hardness measurement. A three-dimensional finite element method (3-D FEM) model was developed to validate the nanoscratch method and to understand the substrate effect during nanoscratch testing. Nanoindentation was also used for comparison. The nanoscratch method was demonstrated to be valuable for measuring the hardness of thin solid films.
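By way of illustration, scratch hardness is commonly reduced from the normal load and the residual scratch width (cf. the ASTM G171 definition HS = 8P/(πw²)); whether this study uses exactly that reduction is an assumption, and the numbers below are hypothetical.

```python
# Common scratch-hardness reduction (ASTM G171 form); illustrative numbers only.
import math

def scratch_hardness(normal_load_N, scratch_width_m):
    """Scratch hardness (Pa) from normal load P and residual scratch width w."""
    return 8.0 * normal_load_N / (math.pi * scratch_width_m ** 2)

# e.g. a 20 mN scratch leaving a 1.5 um wide groove (hypothetical values):
print("HS ≈ %.1f GPa" % (scratch_hardness(20e-3, 1.5e-6) / 1e9))
```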

Relevance: 20.00%

Abstract:

The research objective of this thesis was to contribute to Bayesian statistical methodology, specifically to risk assessment methodology and to spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. Two applied areas were considered, both as a springboard for developing new statistical methods and as analyses that might answer particular applied questions: risk assessment analyses for recycled water (wastewater), and a four-dimensional dataset assessing differences between cropping systems in water usage over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and to use Bayesian hierarchical models to explore the necessarily complex modelling of the four-dimensional agricultural data.

The specific objectives of the research were: to develop a method for calculating credible intervals for the point estimates of Bayesian networks; to develop a model structure incorporating all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day's data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days' data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations. This work forms five papers: two have been published, two are submitted, and the final paper is still in draft.

The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, treating some of those data as an 'errors-in-variables' problem [Fuller, 1987]. This illustrated a simple method for incorporating experimental error into risk assessments.

In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models are used with neighbours only in the same depth layer. This gave the model flexibility, allowing both the spatially structured and non-structured variances to differ at each depth. We call this the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so allows little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to examine the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution was the analysis of moisture over depth and the estimation of the contrast of interest together with its credible intervals. These models were fitted using WinBUGS [Lunn et al., 2000].

The fifth paper deals with the fact that, for large datasets, WinBUGS becomes problematic because of its highly correlated term-by-term updating. In this work we introduce a Gibbs sampler with block updating for the CAR layered model; the sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that soil moisture under each treatment reaches a level particular to that treatment at a depth of 200 cm and thereafter stays constant, albeit with variance increasing with depth. In an analysis across three spatial dimensions and time, there are many interactions of time and the spatial dimensions to consider. We therefore chose to use a daily model and to repeat the analysis at each time point, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility, but gives no insight into how the parameter of interest varies over time; hence a two-stage approach was also used, with the first-stage estimates analysed as a set of time series. We see this spatio-temporal interaction model as a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
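A schematic sketch of the CAR layered idea, assuming a proper CAR prior whose neighbourhood graph links only sites within the same depth layer, so each layer carries its own precision, together with one joint (block) Gibbs draw of a layer's spatial effects; the grid size, ρ and τ values and the Gaussian data model are illustrative assumptions, and the thesis itself fits these models in WinBUGS and pyMCMC.

```python
# CAR layered sketch: per-layer CAR precision and a block Gibbs update.
import numpy as np

rng = np.random.default_rng(0)
nx, ny, n_layers = 6, 6, 4               # sites per layer, number of depth layers
n = nx * ny

# Adjacency within one layer only (rook neighbours on the nx-by-ny grid).
W = np.zeros((n, n))
for i in range(nx):
    for j in range(ny):
        k = i * ny + j
        for di, dj in ((1, 0), (0, 1)):
            ii, jj = i + di, j + dj
            if ii < nx and jj < ny:
                m = ii * ny + jj
                W[k, m] = W[m, k] = 1.0
D = np.diag(W.sum(axis=1))

def car_precision(tau, rho=0.99):
    """Proper CAR precision tau * (D - rho * W) for a single depth layer."""
    return tau * (D - rho * W)

def sample_layer_effects(tau, y, sigma2):
    """Block Gibbs update: joint draw of one layer's spatial effects | data."""
    Q = car_precision(tau) + np.eye(n) / sigma2      # posterior precision
    mean = np.linalg.solve(Q, y / sigma2)            # posterior mean
    chol = np.linalg.cholesky(Q)                     # Q = L L^T
    z = rng.standard_normal(n)
    return mean + np.linalg.solve(chol.T, z)         # cov of draw = Q^{-1}

# Each depth layer has its own precision, as in the layered model:
taus = [0.5, 1.0, 2.0, 4.0]
y = rng.standard_normal((n_layers, n))               # stand-in residual data
effects = np.array([sample_layer_effects(taus[l], y[l], 1.0)
                    for l in range(n_layers)])
print("per-layer effect SDs:", effects.std(axis=1).round(2))
```

Updating a whole layer in one multivariate draw is the block-updating remedy for the highly correlated term-by-term updates described above.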

Relevance: 20.00%

Abstract:

This paper presents a comprehensive study of the most efficient bitrate for delivering mobile video: one that optimizes bandwidth while maintaining a good user viewing experience. In the study, forty participants were asked to choose the lowest-quality video that would still provide a comfortable, long-term viewing experience, knowing that higher video quality is more expensive and bandwidth-intensive. The paper proposes the lowest pleasing bitrates and corresponding encoding parameters for five content types: cartoon, movie, music, news and sports. It also explores how the lowest pleasing quality is influenced by content type, image resolution and bitrate, and by users' gender, prior viewing experience and preferences. In addition, it analyzes the trajectory of users' progression in selecting the lowest pleasing quality. The findings reveal that the lowest bitrate required for a pleasing viewing experience is much higher than that for the lowest acceptable quality, and that users' criteria for the lowest pleasing video quality relate to the video's content features as well as its usage purpose and the user's personal preferences. These findings can guide video providers on what quality they should offer to please mobile users.

Relevance: 20.00%

Abstract:

This is one of the few studies in the academic literature to directly address the inward exporting of customer services, a topic that has received little attention from an international services marketing point of view. The objective of this study is to explore the drivers of satisfaction and dissatisfaction for overseas service customers of higher education in Australia. The critical incident technique (CIT) was used to collect and analyse the data, with a total of 107 critical incidents collected. The findings show that service satisfaction and dissatisfaction for international students derive from elements of the core service (educational service performance), personal sources (international student performance) and the external environment (socialization and host environment performance). Additionally, the results show that the drivers of satisfaction and dissatisfaction for international students are not necessarily the same. Limitations relating to the specific sector of higher education and the cross-sectional nature of the data are addressed.

Relevance: 20.00%

Abstract:

A new type of social network, the online dating network, is gaining popularity. With a large member base, users of a dating network are overloaded with choices of potential partners. Recommendation methods can be used to overcome this problem; however, traditional recommendation methods do not work effectively for online dating networks, where the dataset is sparse and large and a two-way match is required. This paper applies social networking concepts to the problem of developing a recommendation method for online dating networks. We propose a method that uses clustering, SimRank and adapted SimRank algorithms to recommend matching candidates. Empirical results show that the proposed method achieves nearly double the performance of the traditional collaborative filtering and common neighbor methods of recommendation.
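To illustrate the SimRank component, the sketch below runs the standard SimRank fixed-point iteration, s(a,b) = C/(|I(a)||I(b)|) Σ_{i∈I(a)} Σ_{j∈I(b)} s(i,j), on a toy contact graph and recommends the candidates contacted by the most similar user. The clustering step, the adapted SimRank variant and the two-way matching refinement from the paper are omitted, and the graph and decay factor are illustrative.

```python
# Basic SimRank on a toy "who contacted whom" graph (illustrative only).
from collections import defaultdict

contacts = {"u1": ["v1", "v2"], "u2": ["v2", "v3"], "u3": ["v3"],
            "v1": ["u1"], "v2": ["u1", "u2"], "v3": ["u2", "u3"]}
in_nb = defaultdict(list)                # in-neighbours I(.) of each node
for a, outs in contacts.items():
    for b in outs:
        in_nb[b].append(a)

nodes = list(contacts)
C, iters = 0.8, 8                        # decay factor, fixed-point iterations
sim = {(a, b): 1.0 if a == b else 0.0 for a in nodes for b in nodes}

for _ in range(iters):
    new = {}
    for a in nodes:
        for b in nodes:
            if a == b:
                new[(a, b)] = 1.0
            elif in_nb[a] and in_nb[b]:
                total = sum(sim[(i, j)] for i in in_nb[a] for j in in_nb[b])
                new[(a, b)] = C * total / (len(in_nb[a]) * len(in_nb[b]))
            else:
                new[(a, b)] = 0.0
    sim = new

# Recommend to u3 the candidates contacted by the user most similar to u3.
peers = sorted((sim[("u3", u)], u) for u in ("u1", "u2"))
print("most similar to u3:", peers[-1][1],
      "-> recommend:", set(contacts[peers[-1][1]]) - set(contacts["u3"]))
```

Because similarity is propagated through shared contacts rather than shared ratings, this style of measure remains usable on the sparse interaction data described above.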