184 results for intelligent agent


Relevance: 20.00%

Abstract:

Teaching introductory programming has challenged educators through the years. Although Intelligent Tutoring Systems that teach programming have been developed to try to reduce the problem, none have been developed to teach web programming. This paper describes the design and evaluation of the PHP Intelligent Tutoring System (PHP ITS), which addresses this problem. The evaluation process showed that students who used the PHP ITS achieved a significant improvement in test scores.

Relevance: 20.00%

Abstract:

Reliability analysis is crucial to reducing the unexpected downtime, severe failures and ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest, as hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods were established largely on two assumptions: one is that the baseline failure distribution accurately describes the population concerned, and the other concerns the assumed effects of covariates on hazards. These two assumptions may be difficult to satisfy and therefore compromise the effectiveness of hazard models in practice. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations imposed by the two assumptions of statistical models. With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality due to inconsistent measuring frequencies of multiple covariates, sensor failures, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research thus investigates the incomplete-covariate problem in reliability analysis. Typical approaches to handling incomplete covariates have been studied to investigate their performance and effects on reliability analysis results.
Since these existing approaches could underestimate the variance in regressions and introduce extra uncertainty into reliability analysis, the developed NNHMs are extended to include handling of incomplete covariates as an integral part. The extended versions of the NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical incomplete-covariate handling approaches. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation due to the influence of both engineering degradation and changes in environmental settings. The commonly used covariate extrapolation methods are thus unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, projection of covariate states is conducted in this research. The estimated covariate states and the unknown covariate values in future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure generates more accurate analysis results.
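The central idea above — replacing an assumed baseline failure distribution and covariate-effect form with a learned non-linear mapping from asset age and covariates to hazard — can be sketched as follows. This is a minimal illustration with random placeholder weights and hypothetical covariates (e.g. vibration and temperature readings); the thesis's NNHMs are trained on real failure and condition data.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-hidden-layer network mapping (age, covariate_1, covariate_2) -> hazard.
# The weights below are random placeholders standing in for trained values.
W1 = rng.normal(size=(3, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)

def hazard(age, covariates):
    """Non-linear hazard estimate; the softplus output keeps the hazard
    positive without assuming any baseline distribution or covariate-effect
    form."""
    x = np.concatenate(([age], covariates))
    h = np.tanh(x @ W1 + b1)           # hidden layer
    z = (h @ W2 + b2).item()
    return float(np.log1p(np.exp(z)))  # softplus(z) > 0 for any z

# Hypothetical query: hazard at age 100 hours given two condition readings.
h100 = hazard(100.0, np.array([0.3, 0.7]))
```

Because the network is an unconstrained function approximator (apart from positivity of the output), the two statistical assumptions criticised above simply never enter the model.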

Relevance: 20.00%

Abstract:

This thesis investigates the possibility of using an adaptive tutoring system for beginning programming students. The work involved designing, developing and evaluating such a system and showing that it was effective in increasing the students' test scores. In doing so, Artificial Intelligence techniques were used to analyse PHP programs written by students and to provide feedback based on any specific errors they made. Methods were also included to provide students with the next best exercise to suit their particular level of knowledge.
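The "next best exercise" idea can be illustrated with a simple difficulty-matching rule: pick the easiest exercise still above the student's estimated knowledge level. The exercise list and levels below are hypothetical; the thesis's actual student model is richer than a single scalar.

```python
# Hypothetical exercise pool with scalar difficulty ratings.
EXERCISES = [
    {"name": "echo basics",   "difficulty": 1},
    {"name": "form handling", "difficulty": 3},
    {"name": "sessions",      "difficulty": 5},
]

def next_exercise(knowledge_level):
    """Return the easiest exercise harder than the student's current level,
    or None when the pool is exhausted."""
    harder = [e for e in EXERCISES if e["difficulty"] > knowledge_level]
    return min(harder, key=lambda e: e["difficulty"]) if harder else None

choice = next_exercise(2)
```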

Relevance: 20.00%

Abstract:

A Distributed Wireless Smart Camera (DWSC) network is a special type of Wireless Sensor Network (WSN) that processes captured images in a distributed manner. While image processing on DWSCs has great potential for growth, with practical applications in domains such as security surveillance and health care, it suffers from tremendous constraints. In addition to the limitations of conventional WSNs, image processing on DWSCs requires more computational power, bandwidth and energy, which presents significant challenges for large-scale deployments. This dissertation has developed a number of algorithms that are highly scalable, portable, energy efficient and performance efficient, with consideration of the practical constraints imposed by the hardware and the nature of WSNs. More specifically, these algorithms tackle the problems of multi-object tracking and localisation in distributed wireless smart camera networks and of optimal camera configuration determination. Addressing the first problem, multi-object tracking and localisation, requires solving a large array of sub-problems. The sub-problems discussed in this dissertation are calibration of internal parameters, multi-camera calibration for localisation, and object handover for tracking. These topics have been covered extensively in the computer vision literature; however, new algorithms must be invented to accommodate the various constraints introduced by the DWSC platform. A technique has been developed for the automatic calibration of low-cost cameras which are assumed to be restricted in their freedom of movement to either pan or tilt movements.
Camera internal parameters, including focal length, principal point, lens distortion parameters and the angle and axis of rotation, can be recovered from a minimum set of two images from the camera, provided that the axis of rotation between the two images goes through the camera's optical centre and is parallel to either the vertical (panning) or horizontal (tilting) axis of the image. For object localisation, a novel approach has been developed for calibrating a network of non-overlapping DWSCs in terms of their ground-plane homographies, which can then be used to localise objects. In the proposed approach, a robot travels through the camera network while updating its position in a global coordinate frame, which it broadcasts to the cameras. The cameras use this, along with the image-plane location of the robot, to compute a mapping from their image planes to the global coordinate frame. This is combined with an occupancy map generated by the robot during the mapping process to localise objects moving within the network. In addition, to deal with the problem of object handover between DWSCs with non-overlapping fields of view, a highly scalable, distributed protocol has been designed. Cameras that follow the proposed protocol transmit object descriptions to a selected set of neighbours determined using a predictive forwarding strategy. The received descriptions are then matched at the subsequent camera on the object's path using a probability-maximisation process with locally generated descriptions. The second problem, camera placement, emerges naturally when these pervasive devices are put into real use. The locations, orientations, lens types, etc. of the cameras must be chosen so that the utility of the network is maximised (e.g. maximum coverage) while user requirements are met.
To deal with this, a statistical formulation of the problem of determining optimal camera configurations has been introduced and a Trans-Dimensional Simulated Annealing (TDSA) algorithm has been proposed to effectively solve the problem.
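The robot-based calibration described above reduces, per camera, to estimating a homography from the image plane to the global ground plane using the robot's broadcast coordinates and its observed image position. A minimal sketch of that estimation step, using the standard direct linear transform (DLT) on hypothetical correspondences (the actual system must also cope with measurement noise and build the occupancy map):

```python
import numpy as np

def estimate_homography(image_pts, world_pts):
    """DLT estimate of the homography mapping image points to ground-plane
    points from >= 4 correspondences; the solution is the null vector of A."""
    A = []
    for (u, v), (x, y) in zip(image_pts, world_pts):
        A.append([-u, -v, -1, 0, 0, 0, u * x, v * x, x])
        A.append([0, 0, 0, -u, -v, -1, u * y, v * y, y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)   # smallest-singular-value right vector

def to_world(H, u, v):
    """Map an image-plane detection into the global ground-plane frame."""
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]

# Hypothetical correspondences: the robot observed at four image positions
# while broadcasting its ground-plane coordinates (in metres).
img = [(10, 10), (200, 15), (190, 120), (20, 110)]
wld = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]
H = estimate_homography(img, wld)
```

Once `H` is estimated, any image-plane detection can be mapped into the shared global frame with `to_world`, which is what lets non-overlapping cameras localise objects in a common coordinate system.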

Relevance: 20.00%

Abstract:

Designing the smart grid requires combining varied models. As their number increases, so does the complexity of the software. Having a well-thought-out architecture for the software then becomes crucial. This paper presents MODAM, a framework designed to combine agent-based models in a flexible and extensible manner, using well-known software engineering design solutions (the OSGi specification [1] and Eclipse plugins [2]). Details on how to build a modular agent-based model for the smart grid are given in this paper, illustrated by an example for a small network.

Relevance: 20.00%

Abstract:

We demonstrate a rapid synthesis of gold nanoparticles using hydroquinone as a reducing agent under acidic conditions without the need for precursor seed particles. The nanoparticle formation process is facilitated by the addition of NaOH to a solution containing HAuCl4 and hydroquinone to locally change the pH; this enhances the reducing capability of hydroquinone to form gold nucleation centres, after which further growth of gold can take place through an autocatalytic mechanism. The stability of the nanoparticles is highly dependent on the initial solution pH and on the concentrations of both the added NaOH and the hydroquinone present in solution. The gold nanoparticles were characterized by UV–visible spectroscopy, transmission electron microscopy, Fourier transform infrared spectroscopy, atomic force microscopy, dynamic light scattering, and zeta potential measurements. It was found that, under optimal conditions, stable aqueous suspensions of 20 nm diameter nanoparticles can be achieved, where benzoquinone, the oxidized product of hydroquinone, acts as a capping agent preventing nanoparticle aggregation.

Relevance: 20.00%

Abstract:

Purpose: Data from two randomized phase III trials were analyzed to evaluate prognostic factors and treatment selection in the first-line management of advanced non-small cell lung cancer patients with performance status (PS) 2. Patients and Methods: Patients randomized to combination chemotherapy (carboplatin and paclitaxel) in one trial and single-agent therapy (gemcitabine or vinorelbine) in the second were included in these analyses. Both studies had identical eligibility criteria and were conducted simultaneously. Comparison of efficacy and safety was performed between the two cohorts. A regression analysis identified prognostic factors and subgroups of patients that may benefit from combination or single-agent therapy. Results: Two hundred one patients were treated with combination and 190 with single-agent therapy. Objective responses were 37 and 15%, respectively. Median time to progression was 4.6 months in the combination arm and 3.5 months in the single-agent arm (p < 0.001). Median survival times were 8.0 and 6.6 months, and 1-year survival rates were 31 and 26%, respectively. Albumin <3.5 g, extrathoracic metastases, lactate dehydrogenase ≥200 IU, and 2 comorbid conditions predicted outcome. Patients with 0-2 risk factors had similar outcomes independent of treatment, whereas patients with 3-4 factors had a nonsignificant improvement in median survival with combination chemotherapy. Conclusion: Our results show that PS 2 non-small cell lung cancer patients are a heterogeneous group who have significantly different outcomes. Patients treated with first-line combination chemotherapy had a higher response and longer time to progression, whereas overall survival did not appear significantly different. A prognostic model may be helpful in selecting PS 2 patients for either treatment strategy. © 2009 by the International Association for the Study of Lung Cancer.

Relevance: 20.00%

Abstract:

The Smart Fields programme has been active in Shell over the last decade and has given large benefits. In order to understand the value and to underpin strategies for the future implementation programme, a study was carried out to quantify the benefits to date. This focused on actually achieved value, through increased production or lower costs. This provided an estimate of the total value achieved to date. Future benefits such as increased reserves or continued production gain were recorded separately. The paper describes the process followed in the benefits quantification. It identifies the key solutions and technologies and describes the mechanism used to understand the relation between solutions and value. Examples have been given of value from various assets around the world, in both existing fields and in green fields. Finally, the study provided the methodology for tracking of value. This helps Shell to estimate and track the benefits of the Smart Fields programme at company scale.

Relevance: 20.00%

Abstract:

We read with great interest the article entitled “Enhancing drugs absorption through third-degree burn wound eschar” by Manafi et al. [1]. The authors addressed the concern of poor penetration of topically applied anti-microbials through burn eschar and detailed the improvement of this penetration by penetration enhancers. Here, we would like to report the poor penetration of a topical agent into the viable deep dermal layer under burn eschar in a porcine burn model [2]. In burn treatment, a common practice is the topical application of either anti-microbial products or wound-enhancing agents. While anti-microbial products are designed to fight microbes on the wound surface with the least toxicity to viable tissue, wound-enhancing agents need to reach the viable tissue layer under the burn eschar. Many studies have reported accelerated healing of superficial burn wounds and skin graft donor sites with the topical application of exogenous growth factors [3]. It is well known that the efficacy of penetration of a topical agent through intact skin mostly depends on the molecular size of the product [4] and [5]. While burn injury destroys this epidermal physiological barrier, the coagulated burn tissue layer on the burn wound surface makes it difficult for topical agents to reach viable tissue....

Relevance: 20.00%

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed to some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail are used, and computer-intensive, as many interactions between agents, which can learn and have goals, are required. With the growing availability of data and the increase in computer power, these concerns are, however, fading. Nonetheless, being able to update or extend the model as more information becomes available can be problematic because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but also the model itself. Using such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which both have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goals and previous learning experiences. This approach diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same – this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on which simulation is to be run. For example, data can be used to describe the environment to which the agents respond – e.g. weather for solar panels – or to describe the assets and their relations to one another – e.g. the network assets. Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can be done sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours as well as new types of assets.
Simulations have been run to understand the potential impact of changes to the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. responses to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains, such as transport; this is part of future work, with the addition of electric vehicles.
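The asset/agent separation described above can be sketched in a few lines: the same physical asset description is reused under two different behaviours. All names and numbers here are hypothetical illustrations, not MODAM's actual API (which is built on OSGi and Eclipse plugins):

```python
from dataclasses import dataclass

@dataclass
class BatteryAsset:
    """Physical characteristics only -- no behaviour."""
    capacity_kwh: float
    depth_of_discharge: float  # usable fraction of capacity

    @property
    def usable_kwh(self):
        return self.capacity_kwh * self.depth_of_discharge

class PeakShavingAgent:
    """Behaviour: discharge only the energy needed above a demand threshold."""
    def __init__(self, asset, threshold_kw):
        self.asset, self.threshold_kw = asset, threshold_kw
    def act(self, demand_kw):
        return min(max(demand_kw - self.threshold_kw, 0.0), self.asset.usable_kwh)

class BackupAgent:
    """Behaviour: discharge everything, but only during an outage."""
    def __init__(self, asset):
        self.asset = asset
    def act(self, demand_kw, outage=False):
        return self.asset.usable_kwh if outage else 0.0

# One physical asset, two interchangeable behaviours.
battery = BatteryAsset(capacity_kwh=10.0, depth_of_discharge=0.8)
```

Because the physical description lives only in `BatteryAsset`, swapping `PeakShavingAgent` for `BackupAgent` changes the simulated behaviour without touching the asset data — the reusability and composability the framework aims for.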

Relevance: 20.00%

Abstract:

Six consecutively hatched cohorts and one cohort of pre-hatch eggs of farmed barramundi (Lates calcarifer) from South Australia were examined for Chlamydia-like organisms associated with epitheliocystis. To identify and characterise the bacteria, 59 gill samples and three pre-hatch egg samples were processed for histology, in situ hybridisation and 16S rRNA amplification, sequencing and comprehensive phylogenetic analysis. Cases of epitheliocystis were observed microscopically and characterised by membrane-enclosed basophilic cysts filled with a granular material that caused hypertrophy of the epithelial cells. In situ hybridisation with a Chlamydiales-specific probe led to specific labelling of the epitheliocystis inclusions within the gill epithelium. Two distinct but closely related 16S rRNA chlamydial sequences were amplified from gill DNA across the seven cohorts, including from pre-hatch eggs. These genotype sequences were found to be novel, sharing 97.1–97.5% similarity with the next closest 16S rRNA sequence, Ca. Similichlamydia latridicola, from the Australian striped trumpeter. Comprehensive phylogenetic analysis of these genotype sequences against representative members of the order Chlamydiales and against other epitheliocystis agents revealed these Chlamydia-like organisms to be novel and taxonomically placed them within the recently proposed genus Ca. Similichlamydia. Following Fredricks and Relman's molecular postulates and based on these observations, we propose the epitheliocystis agents of barramundi to be known as "Candidatus Similichlamydia laticola" (sp. nov.).

Relevance: 20.00%

Abstract:

Three cohorts of farmed yellowtail kingfish (Seriola lalandi) from South Australia were examined for Chlamydia-like organisms associated with epitheliocystis. To characterize the bacteria, 38 gill samples were processed for histopathology, electron microscopy, and 16S rRNA amplification, sequencing, and phylogenetic analysis. Microscopically, the presence of membrane-enclosed cysts was observed within the gill lamellae. Also observed was hyperplasia of the epithelial cells with cytoplasmic vacuolization and fusion of the gill lamellae. Transmission electron microscopy revealed morphological features of the reticulate and intermediate bodies typical of members of the order Chlamydiales. A novel 1,393-bp 16S chlamydial rRNA sequence was amplified from gill DNA extracted from fish in all cohorts over a 3-year period that corresponded to the 16S rRNA sequence amplified directly from laser-dissected cysts. This sequence was only 87% similar to the reported "Candidatus Piscichlamydia salmonis" (AY462244) from Atlantic salmon and Arctic charr. Phylogenetic analysis of this sequence against 35 Chlamydia and Chlamydia-like bacteria revealed that this novel bacterium belongs to an undescribed family lineage in the order Chlamydiales. Based on these observations, we propose this bacterium of yellowtail kingfish be known as "Candidatus Parilichlamydia carangidicola" and that the new family be known as "Candidatus Parilichlamydiaceae."

Relevance: 20.00%

Abstract:

Histological analysis of gill samples taken from individuals of Latris lineata reared in aquaculture in Tasmania, Australia, and those sampled from the wild revealed the presence of epitheliocystis-like basophilic inclusions. Subsequent morphological, in situ hybridization, and molecular analyses were performed to confirm the presence of this disease and discovered a Chlamydia-like organism associated with this condition; the criteria set by Fredricks and Relman's postulates were used to establish disease causation. Three distinct 16S rRNA genotypes were sequenced from 16 fish, and phylogenetic analyses of the nearly full-length 16S rRNA sequences generated for this bacterial agent indicated that they were nearly identical novel members of the order Chlamydiales. This new taxon formed a well-supported clade with "Candidatus Parilichlamydia carangidicola" from the yellowtail kingfish (Seriola lalandi). On the basis of sequence divergence over the 16S rRNA region relative to all other members of the order Chlamydiales, a new genus and species are proposed here for the Chlamydia-like bacterium from L. lineata, i.e., "Candidatus Similichlamydia latridicola" gen. nov., sp. nov.

Relevance: 20.00%

Abstract:

Learning programming is known to be difficult. One possible reason why students fail programming is that traditional learning in the classroom places more emphasis on lecturing the material than on applying it to real applications. For some students, this teaching model may not catch their interest, and as a result they may not give their best effort to understand the material given. Seeing how knowledge can be applied to real-life problems can increase student interest in learning and, as a consequence, their effort to learn. Anchored learning, which applies knowledge to solve real-life problems, may be the key to improving student performance. In anchored learning, it is necessary to provide resources that can be accessed by the student as they learn. These resources can be provided by creating an Intelligent Tutoring System (ITS) that can support the student when they need help or experience a problem. Unfortunately, no ITS developed for the programming domain has incorporated anchored learning in its teaching system. Having an ITS that supports anchored learning will not only help students learn programming effectively but will also make the learning process more enjoyable. This research tries to help students learn C# programming using an anchored-learning ITS named CSTutor. Role playing is used in CSTutor to present a real-world situation in which students develop their skills. A knowledge base using first-order logic is used to represent the student's code and to give feedback and assistance accordingly.
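A toy illustration of the knowledge-base idea: student code is translated into logical facts, which are then compared against the facts expected for the exercise in order to generate feedback. The fact extraction below is deliberately simplistic and entirely hypothetical; CSTutor's actual first-order-logic representation of C# programs is far more complete.

```python
def facts_from_code(code):
    """Extract toy facts like declares(x) and assigns(x, v) from C#-like
    'int x = v;' declarations (a hypothetical, highly simplified parser)."""
    facts = set()
    for line in code.splitlines():
        line = line.strip().rstrip(";")
        if line.startswith("int "):
            name, _, value = line[4:].partition("=")
            facts.add(("declares", name.strip()))
            if value.strip():
                facts.add(("assigns", name.strip(), value.strip()))
    return facts

def feedback(student_facts, expected_facts):
    """Report every expected fact the student's code does not establish."""
    missing = expected_facts - student_facts
    return [f"Expected {f[0]}({', '.join(f[1:])})" for f in sorted(missing)]

student = facts_from_code("int total = 5;")
expected = {("declares", "total"), ("assigns", "total", "0")}
```

Representing code as facts rather than text is what lets the tutor target feedback at the specific error (here, a wrong initial value) instead of merely diffing source strings.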

Relevance: 20.00%

Abstract:

The safety of passengers is a major concern for airports. In the event of a crisis, having an effective and efficient evacuation process in place can significantly enhance passenger safety. Hence, it is necessary for airport operators to have an in-depth understanding of the evacuation process of their airport terminal. Although evacuation models have been used to study pedestrian behaviour for decades, little research has considered evacuees' group dynamics and the complexity of the environment. In this paper, an agent-based model is presented to simulate the passenger evacuation process. Different exits were allocated to passengers based on their location and security level. The simulation results show that evacuation time can be influenced by passenger group dynamics. The model also provides a convenient way to design airport evacuation strategies and examine their efficiency. The model was created using AnyLogic software, and its parameters were initialised using recent research data published in the literature.
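The exit-allocation step described above can be sketched as a simple rule: each passenger agent is assigned the nearest exit that its security clearance permits. Exit positions and clearance levels below are hypothetical; the paper's AnyLogic model additionally simulates movement and group dynamics.

```python
import math

# Hypothetical terminal layout: each exit has a position (metres) and a
# minimum security clearance required to use it.
EXITS = {
    "main":    {"pos": (0.0, 0.0),   "min_clearance": 0},
    "airside": {"pos": (50.0, 10.0), "min_clearance": 1},  # past security only
}

def allocate_exit(passenger_pos, clearance):
    """Nearest exit among those the passenger's clearance allows."""
    usable = {n: e for n, e in EXITS.items() if clearance >= e["min_clearance"]}
    return min(usable, key=lambda n: math.dist(passenger_pos, usable[n]["pos"]))

# A landside passenger (clearance 0) must use the main exit even though
# the airside exit is much closer to them.
chosen = allocate_exit((45.0, 10.0), clearance=0)
```

In the full model this rule feeds the agents' path-finding, so the interplay between allocation policy and group dynamics can be measured as total evacuation time.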