22 results for Sample algorithms

at Brock University, Canada


Relevance: 30.00%

Abstract:

The main focus of this thesis is to evaluate and compare the Hyperbalilearning algorithm (HBL) to other learning algorithms. In this work, HBL is compared to feed-forward artificial neural networks using back-propagation learning, K-nearest neighbour, and ID3 algorithms. In order to evaluate the similarity of these algorithms, we carried out three experiments using nine benchmark data sets from the UCI machine learning repository. The first experiment compares HBL to the other algorithms as the sample size of the dataset changes. The second experiment compares HBL to the other algorithms as the dimensionality of the data changes. The last experiment compares HBL to the other algorithms according to their level of agreement with the data target values. In general, our observations showed that, taking classification accuracy as the measure, HBL performs as well as most ANN variants. Additionally, we also deduced that HBL's classification accuracy outperforms ID3's and K-nearest neighbour's for the selected data sets.
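
A minimal sketch of the first experiment's setup (comparing classification accuracy as the training sample size grows) might look like the following. It uses scikit-learn baselines only: a KNN classifier, a decision tree standing in for ID3, and a back-propagation MLP; HBL itself is not available in standard libraries, and the dataset and fractions below are illustrative assumptions, not the thesis's benchmarks.

```python
# Illustrative sketch only: compares baseline classifiers (not HBL itself)
# as the training sample size grows, mirroring the thesis's first experiment.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier      # stands in for ID3
from sklearn.neural_network import MLPClassifier     # back-propagation ANN

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=3),
    "Decision tree (ID3-like)": DecisionTreeClassifier(criterion="entropy"),
    "MLP (backprop)": MLPClassifier(max_iter=2000, random_state=0),
}

for frac in (0.2, 0.5, 1.0):                          # vary the training sample size
    n = max(1, int(frac * len(X_train)))
    for name, model in models.items():
        model.fit(X_train[:n], y_train[:n])
        acc = model.score(X_test, y_test)
        print(f"{frac:>4.0%} of training data  {name:<25} accuracy={acc:.3f}")
```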

Relevance: 20.00%

Abstract:

This research attempted to address the question of the role of explicit algorithms and episodic contexts in the acquisition of computational procedures for regrouping in subtraction. Three groups of students having difficulty learning to subtract with regrouping were taught procedures for doing so through either an explicit algorithm, an episodic context, or an examples approach. It was hypothesized that the use of an explicit algorithm represented in a flow chart format would facilitate the acquisition and retention of specific procedural steps relative to the other two conditions. On the other hand, the use of paragraph stories to create episodic context was expected to facilitate the retrieval of algorithms, particularly in a mixed presentation format. The subjects were tested on similar, near, and far transfer questions over a four-day period. Near and far transfer algorithms were also introduced on Day Two. The results suggested that both explicit algorithms and episodic context facilitate performance on questions requiring subtraction with regrouping. However, the differential effects of these two approaches on near and far transfer questions were not as easy to identify. Explicit algorithms may facilitate the acquisition of specific procedural steps while at the same time inhibiting the application of such steps to transfer questions. Similarly, the value of episodic context in cuing the retrieval of an algorithm may be limited by the ability of a subject to identify and classify a new question as an exemplar of a particular episodically defined problem type or category. The implications of these findings for the procedures employed in the teaching of mathematics to students with learning problems are discussed in detail.
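
For readers unfamiliar with the term, an "explicit algorithm" for regrouping is the familiar digit-by-digit borrowing procedure. The sketch below spells out those procedural steps; it is hypothetical illustration, not the flow-chart material used in the study.

```python
def subtract_with_regrouping(minuend: int, subtrahend: int) -> int:
    """Digit-by-digit subtraction with borrowing (regrouping), written out
    as explicit procedural steps. Assumes 0 <= subtrahend <= minuend."""
    top = [int(d) for d in str(minuend)][::-1]        # least significant digit first
    bottom = [int(d) for d in str(subtrahend)][::-1]
    bottom += [0] * (len(top) - len(bottom))          # pad the shorter number with zeros
    result = []
    for i in range(len(top)):
        if top[i] < bottom[i]:                        # regroup: borrow from the next column
            top[i] += 10
            top[i + 1] -= 1
        result.append(top[i] - bottom[i])
    digits = "".join(str(d) for d in result[::-1]).lstrip("0") or "0"
    return int(digits)

assert subtract_with_regrouping(402, 178) == 224      # borrowing across a zero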

Relevance: 20.00%

Abstract:

Microwave digestions of mercury in Standard Reference Material (SRM) coal samples with nitric acid and hydrogen peroxide in quartz vessels were compared with Teflon® vessel digestion by using flow injection cold vapor atomic absorption spectrometry. Teflon® vessels gave poor reproducibility and tended to deliver high values, while the digestion results from the quartz vessels showed good agreement with certificate values and better standard deviations. Trace-level elements (Ag, Ba, Cd, Cr, Co, Cu, Fe, Mg, Mn, Mo, Pb, Sn, Ti, V and Zn) in used oil and residual oil samples were determined by inductively coupled plasma-optical emission spectrometry. Different microwave digestion programs were developed for each sample, and most of the results are in good agreement with certified values. The disagreement with the certified value for Ag was due to the precipitation of Ag in the sample, while Sn, V and Zn had good recoveries in the spike test, which suggests that these certified values might need to be reconsidered. Gold, silver, copper, cadmium, cobalt, nickel and zinc were determined by continuous hydride generation inductively coupled plasma-optical emission spectrometry. The performance of two sample introduction systems, the MSIS™ and a gas-liquid separator, was compared. Under their respective optimum conditions, the MSIS™ showed better sensitivity and lower detection limits for Ag, Cd, Cu and Co, and values for Au, Ni and Zn similar to those for the gas-liquid separator.
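
As a rough illustration of how the spike-test results mentioned above are commonly interpreted, a percent-recovery calculation could be done as follows; the element, concentrations and spike amount are hypothetical numbers, not data from the thesis.

```python
def percent_recovery(measured_spiked: float, measured_unspiked: float, amount_added: float) -> float:
    """Spike recovery in %: how much of a known added amount the method recovers."""
    return 100.0 * (measured_spiked - measured_unspiked) / amount_added

# Hypothetical example: 2.00 ug/g of Sn added to an oil sample
print(f"Sn recovery: {percent_recovery(3.05, 1.10, 2.00):.1f} %")   # -> 97.5 %
```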

Relevance: 20.00%

Abstract:

The nature of this research is to investigate paleoseismic deformation of glacial soft sediments from three sampling sites in the Scottish Highlands: Arrat's Mill, Meikleour and Glen Roy. The paleoseismic evidence investigated in this research will provide a basis for applying criteria to soft sediment deformation structures and the trigger mechanisms that create them. Micromorphology is the tool used in this research to investigate paleoseismic deformation structures in thin section. Thin-section analysis (micromorphology) of glacial sediments from the three sampling sites is used to determine microscale evidence of past earthquakes that can be correlated to modern-day events and possibly lead to a better understanding of the impact of earthquakes throughout a range of sediment types. The significance of the three sampling locations is their proximity to two major active fault zones that cross Scotland. The fault zones are the Highland Boundary Fault and the Great Glen Fault; these two major faults parallel each other and divide the country in half. Sims (1975) used a set of seven criteria that identified soft sediment deformation structures created by a magnitude six earthquake in California. Using the criteria set forth by Sims (1975), the paleoseismic evidence can be correlated to the magnitude of the deformation structures found in the glacial sediments. This research determined that the microstructures at Arrat's Mill, Meikleour and Glen Roy are consistent with a seismically induced origin. It has also been demonstrated that, even without the presence of macrostructures, the use of micromorphology techniques in detecting such activity within sediments is of immense value.

Relevance: 20.00%

Abstract:

The effects of sample solvent composition and injection volume on the chromatographic peak profiles of two carbamate derivatives, methyl 2-benzimidazolecarbamate (MBC) and 3-butyl-2,4-dioxo[1,2-a]-s-triazinobenzimidazole (STB), were studied using reverse-phase high-performance liquid chromatography. The study examined the effect of acetonitrile percentage in the sample solvent from 5 to 50%, the effect of methanol percentage from 5 to 50%, the effect of a pH increase from 4.42 to 9.10, and the effect of increasing buffer concentration from 0 to 0.12 M. The effects were studied at constant and increasing injection mass and at four injection volumes of 10, 50, 100 and 200 µL. The study demonstrated that the amount and type of organic solvent, the pH, and the buffer strength of the sample solution can have a pronounced effect on the peak heights, peak widths, and retention times of the compounds analysed. MBC, which is capable of intramolecular hydrogen bonding and has no tendency to ionize, showed a predictable increase in band broadening and a decrease in retention times at higher eluting strengths of the sample solvent. STB, which has a tendency to ionize or to interact strongly with the sample solvent, was influenced in various ways by the changes in the sample solvent composition. The sample solvent effects became more pronounced as the injection volume increased and as the percentage of organic solvent in the sample solution became greater. The peak height increases for STB at increasing buffer concentrations became much more pronounced at higher analyte concentrations. It was shown that the widely accepted procedure of dissolving samples in the mobile phase does not yield the most efficient chromatograms. For that reason, samples should, whenever possible, be dissolved in solutions with a higher aqueous content than that of the mobile phase. The results strongly recommend that all samples and standards, regardless of whether the standards are external or internal, be analysed at a constant sample composition and a constant injection volume.
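
Band broadening of the kind reported here is commonly quantified through the column plate number. The short sketch below uses the standard half-height formula N = 5.54·(tR/w½)²; the retention times and peak widths are hypothetical values, not measurements from the thesis.

```python
def plate_number(t_r: float, w_half: float) -> float:
    """Column efficiency from retention time and peak width at half height:
    N = 5.54 * (tR / w_1/2)^2, the standard chromatographic plate-count formula."""
    return 5.54 * (t_r / w_half) ** 2

# Hypothetical MBC peaks: same retention time, broader peak at higher organic content
print(round(plate_number(6.2, 0.12)))   # narrower peak -> higher N
print(round(plate_number(6.2, 0.20)))   # broader peak  -> lower N (more band broadening)
```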

Relevance: 20.00%

Abstract:

In 2003, prostate cancer (PCa) is estimated to be the most commonly diagnosed cancer and the third leading cause of cancer death in Canada. During PCa population screening, approximately 25% of patients with a normal digital rectal examination (DRE) and an intermediate serum prostate-specific antigen (PSA) level have PCa. Since all such patients typically undergo biopsy, it is expected that approximately 75% of these procedures are unnecessary. The purpose of this study was to compare the efficacy of clinical tests and algorithms in stage II screening for PCa while preventing unnecessary biopsies. The sample consisted of 201 consecutive men who were suspected of PCa based on the results of a DRE and serum PSA. These men were referred for venipuncture and transrectal ultrasound (TRUS). Clinical tests included TRUS, age-specific reference range PSA (Age-PSA), prostate-specific antigen density (PSAD), and the free-to-total prostate-specific antigen ratio (%fPSA). Clinical results were evaluated individually and within algorithms. Cutoffs of 0.12 and 0.15 ng/ml/cc were employed for PSAD. Cutoffs that would provide a minimum sensitivity of 0.90 and 0.95, respectively, were utilized for %fPSA. Statistical analysis included ROC curve analysis and calculated sensitivity (Sens), specificity (Spec), and positive likelihood ratio (LR), with corresponding confidence intervals (CI). The %fPSA, at a 23% cutoff ({Sens=0.92; CI, 0.06}, {Spec=0.41; CI, 0.09}, {LR=1.56; CI, 0.11}), proved to be the most efficacious independent clinical test. The combination of PSAD (cutoff 0.15 ng/ml/cc) and %fPSA (cutoff 23%) ({Sens=0.93; CI, 0.06}, {Spec=0.38; CI, 0.08}, {LR=1.50; CI, 0.10}) was the most efficacious clinical algorithm. This study advocates the use of %fPSA at a cutoff of 23% when screening patients with an intermediate serum PSA and a benign DRE.
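
The screening statistics quoted above follow standard definitions. The sketch below shows how sensitivity, specificity, the positive likelihood ratio, and PSA density are computed; the LR check reproduces the reported %fPSA figures, while the 2x2 counts and the patient values are hypothetical, not the study's data.

```python
def sensitivity(tp: int, fn: int) -> float:
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    return tn / (tn + fp)

def positive_lr(sens: float, spec: float) -> float:
    return sens / (1.0 - spec)          # LR+ = Sens / (1 - Spec)

def psa_density(psa_ng_ml: float, volume_cc: float) -> float:
    return psa_ng_ml / volume_cc        # PSAD in ng/ml/cc (PSA / TRUS prostate volume)

# Consistency check against the reported %fPSA figures (Sens=0.92, Spec=0.41):
print(round(positive_lr(0.92, 0.41), 2))                              # -> 1.56
# Hypothetical 2x2 screening table (illustrative counts only):
print(round(sensitivity(tp=55, fn=5), 2), round(specificity(tn=58, fp=83), 2))
# Hypothetical patient: PSA 6.0 ng/ml, TRUS volume 45 cc
print(round(psa_density(6.0, 45.0), 3))                               # -> 0.133, above a 0.12 cutoff
```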

Relevance: 20.00%

Abstract:

This study investigated, retrospectively, whether recidivism in a sample of court-ordered graduates of an alcohol education and awareness program could be predicted. This alcohol education program was based on adult education principles and was philosophically akin to the thoughts of Drs. Jack Mezirow, Stephen Brookfield, and Patricia Cranton. Data on the sample of 214 Halton IDEA (Impaired Driver Education and Awareness) graduates were entered into a spreadsheet, and descriptive statistics were generated. Each of the 214 program graduates had taken several tests during the course of the IDEA program. These tests measured knowledge, attitude about impaired driving, and degree of alcohol involvement. Test scores were analyzed to determine whether those IDEA graduates who recidivated differed in any measurable way from those who had no further criminal convictions after a period of at least three years. Their criminal records were obtained from the Canadian Police Information Centre (CPIC). Those program graduates who reoffended were compared to the vast majority who did not reoffend. Results of the study indicated that there was no way to determine who would recidivate from the data that were collected. Further studies could use a qualitative model; follow-up interviews could be used to determine what impact, if any, attendance at the IDEA program had on the lives of the graduates.

Relevance: 20.00%

Abstract:

The (n, k)-star interconnection network was proposed in 1995 as an attractive alternative to the n-star topology in parallel computation. The (n, k)-star has significant advantages over the n-star, which itself was proposed as an attractive alternative to the popular hypercube. The major advantage of the (n, k)-star network is its scalability, which makes it more flexible than the n-star as an interconnection network. In this thesis, we focus on finding graph-theoretical properties of the (n, k)-star as well as developing parallel algorithms that run on this network. The basic topological properties of the (n, k)-star are first studied. These are useful since they can be used to develop efficient algorithms on this network. We then study the (n, k)-star network from an algorithmic point of view. Specifically, we investigate both fundamental and application algorithms for basic communication, prefix computation, sorting, and so on. A literature review of the state of the art in relation to the (n, k)-star network, as well as some open problems in this area, is also provided.
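
A small construction sketch may help fix the object being studied. Under the usual definition, the vertices of the (n, k)-star are the k-permutations of {1, …, n}, and a vertex is adjacent to the permutations obtained either by swapping its first symbol with the symbol in position i (2 ≤ i ≤ k) or by replacing its first symbol with a symbol not appearing in the permutation; the code below is an illustrative adjacency-list builder, not the thesis's algorithms.

```python
from itertools import permutations

def nk_star_graph(n: int, k: int) -> dict:
    """Adjacency lists of the (n, k)-star graph S_{n,k} under its usual definition."""
    symbols = range(1, n + 1)
    vertices = list(permutations(symbols, k))
    adj = {}
    for v in vertices:
        nbrs = []
        # star edges: swap the first symbol with the symbol in position i (2..k)
        for i in range(1, k):
            u = list(v)
            u[0], u[i] = u[i], u[0]
            nbrs.append(tuple(u))
        # residual edges: replace the first symbol with a symbol not in the permutation
        for s in symbols:
            if s not in v:
                nbrs.append((s,) + v[1:])
        adj[v] = nbrs
    return adj

g = nk_star_graph(4, 2)
assert all(len(nbrs) == 4 - 1 for nbrs in g.values())   # S_{n,k} is (n-1)-regular
```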

Relevance: 20.00%

Abstract:

Bioinformatics applies computers to problems in molecular biology. Previous research has not addressed edit metric decoders. Decoders for quaternary edit metric codes are finding use in bioinformatics problems with applications to DNA. By using side effect machines, we hope to be able to provide efficient decoding algorithms for this open problem. Two ideas for decoding algorithms are presented and examined. Both decoders use Side Effect Machines (SEMs), which are generalizations of finite state automata. Single Classifier Machines (SCMs) use a single side effect machine to classify all words within a code. Locking Side Effect Machines (LSEMs) use multiple side effect machines to create a tree structure of subclassification. The goal is to examine these techniques and provide new decoders for existing codes. Also presented are ideas for best practices for the creation of these two types of new edit metric decoders.
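
The decoding task itself can be stated concretely: given a received DNA word, return the codeword closest to it in edit (Levenshtein) distance. The brute-force sketch below illustrates that baseline; the SEM-based decoders in the thesis aim to do better than this exhaustive search, and the three-word code shown is hypothetical.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming (two rolling rows)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution / match
        prev = cur
    return prev[-1]

def decode(received: str, code: list) -> str:
    """Brute-force nearest-codeword decoding under the edit metric."""
    return min(code, key=lambda w: edit_distance(received, w))

code = ["ACGTAC", "TTGACA", "CCATGG"]    # hypothetical quaternary (DNA) code
print(decode("ACGAAC", code))            # -> "ACGTAC" (one substitution away)
```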

Relevance: 20.00%

Abstract:

The (n, k)-arrangement interconnection topology was first introduced in 1992. The (n, k)-arrangement graph is a class of generalized star graphs. Compared with the well-known n-star, the (n, k)-arrangement graph is more flexible in degree and diameter. However, few algorithms have been designed for the (n, k)-arrangement graph to date. In this thesis, we focus on finding graph-theoretical properties of the (n, k)-arrangement graph and developing parallel algorithms that run on this network. The topological properties of the arrangement graph are first studied, including its cyclic properties. We then study the problems of communication: broadcasting and routing. Embedding problems are also studied later on. These are very useful for developing efficient algorithms on this network. We then study the (n, k)-arrangement network from an algorithmic point of view. Specifically, we investigate both fundamental and application algorithms such as prefix sums computation, sorting, merging, and basic geometric computation: finding the convex hull on the (n, k)-arrangement graph. A literature review of the state of the art in relation to the (n, k)-arrangement network is also provided, as well as some open problems in this area.
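
Under the usual definition of the (n, k)-arrangement graph A_{n,k}, the vertices are again the k-permutations of {1, …, n}, and two vertices are adjacent exactly when they differ in a single position. The short sketch below constructs it and checks that it is k(n − k)-regular; it is an illustrative construction, not code from the thesis.

```python
from itertools import permutations

def arrangement_graph(n: int, k: int) -> dict:
    """Adjacency lists of the (n, k)-arrangement graph A_{n,k} under its usual definition."""
    symbols = range(1, n + 1)
    vertices = list(permutations(symbols, k))
    adj = {}
    for v in vertices:
        nbrs = []
        for i in range(k):                   # position to change
            for s in symbols:                # replace it with any unused symbol
                if s not in v:
                    nbrs.append(v[:i] + (s,) + v[i + 1:])
        adj[v] = nbrs
    return adj

g = arrangement_graph(5, 2)
assert all(len(nbrs) == 2 * (5 - 2) for nbrs in g.values())   # A_{n,k} is k(n-k)-regular
```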

Relevance: 20.00%

Abstract:

The hyper-star interconnection network was proposed in 2002 to overcome the drawbacks of the hypercube and its variations concerning the network cost, which is defined by the product of the degree and the diameter. Some properties of the graph, such as connectivity, symmetry, and embedding properties, have been studied by other researchers, and routing and broadcasting algorithms have also been designed. This thesis studies the hyper-star graph from both the topological and the algorithmic point of view. For the topological properties, we try to establish relationships between hyper-star graphs and other known graphs. We also give a formal equation for the surface area of the graph. Another topological property we are interested in is the Hamiltonicity of this graph. For the algorithms, we design an all-port broadcasting algorithm and a single-port neighbourhood broadcasting algorithm for the regular form of the hyper-star graphs. Both algorithms are time-optimal. Furthermore, we prove the folded hyper-star, a variation of the hyper-star, to be maximally fault-tolerant.
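
As a concrete reference point, the hyper-star HS(m, k) is commonly defined on the binary strings of length m with exactly k ones, two strings being adjacent when one is obtained from the other by exchanging the leading bit with a differing bit in another position; the regular form mentioned above is HS(2k, k). The sketch below builds the graph under that assumed definition and is purely illustrative.

```python
from itertools import combinations

def hyper_star(m: int, k: int) -> dict:
    """Adjacency lists of the hyper-star HS(m, k) under the assumed definition:
    vertices are length-m binary strings with exactly k ones; edges swap the
    leading bit with a differing bit in another position."""
    vertices = []
    for ones in combinations(range(m), k):
        bits = ["0"] * m
        for i in ones:
            bits[i] = "1"
        vertices.append("".join(bits))
    adj = {}
    for v in vertices:
        nbrs = []
        for i in range(1, m):
            if v[i] != v[0]:                 # swap the leading bit with a differing bit
                u = list(v)
                u[0], u[i] = u[i], u[0]
                nbrs.append("".join(u))
        adj[v] = nbrs
    return adj

g = hyper_star(6, 3)                                    # regular form HS(2k, k)
assert all(len(nbrs) == 3 for nbrs in g.values())       # k-regular when m = 2k
```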

Relevance: 20.00%

Abstract:

The hub location problem is an NP-hard problem that frequently arises in the design of transportation and distribution systems, postal delivery networks, and airline passenger flow. This work focuses on the Single Allocation Hub Location Problem (SAHLP). Genetic Algorithms (GAs) for the capacitated and uncapacitated variants of the SAHLP, based on new chromosome representations and crossover operators, are explored. The GAs are tested on two well-known sets of real-world problems with up to 200 nodes. The obtained results are very promising. For most of the test problems the GAs obtain improved or best-known solutions while the computational time remains low. The proposed GAs can easily be extended to other variants of location problems arising in network design planning in transportation systems.
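
A sketch of how a SAHLP chromosome is typically evaluated may clarify the GA approach: a candidate solution assigns each node to exactly one open hub, and its cost sums discounted collection, inter-hub transfer, and distribution legs over all origin-destination flows. The representation, discount factors and the tiny instance below are illustrative assumptions, not the exact encoding or data used in the thesis.

```python
def sahlp_cost(alloc, flow, dist, alpha=0.75, chi=1.0, delta=1.0):
    """Cost of one single-allocation hub location chromosome.

    alloc[i]   -- hub assigned to node i (a node is a hub iff alloc[i] == i)
    flow[i][j] -- demand from node i to node j
    dist[i][j] -- unit transport cost between nodes i and j
    alpha/chi/delta -- inter-hub discount, collection and distribution factors
    """
    n = len(alloc)
    total = 0.0
    for i in range(n):
        for j in range(n):
            hi, hj = alloc[i], alloc[j]
            total += flow[i][j] * (chi * dist[i][hi]
                                   + alpha * dist[hi][hj]
                                   + delta * dist[hj][j])
    return total

# Tiny hypothetical instance: 4 nodes, with nodes 0 and 2 acting as hubs
flow = [[0, 2, 1, 3], [2, 0, 4, 1], [1, 4, 0, 2], [3, 1, 2, 0]]
dist = [[0, 5, 8, 4], [5, 0, 6, 7], [8, 6, 0, 3], [4, 7, 3, 0]]
print(sahlp_cost([0, 0, 2, 2], flow, dist))
```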

Relevance: 20.00%

Abstract:

This study sought to compare the results of the Motivation Assessment Scale (MAS; Durand & Crimmins, 1988), Questions About Behavior Function Scale (QABF; Matson & Vollmer, 1996) and Functional Analysis Screening Tool (FAST; Iwata & DeLeon, 1996), when completed by parent informants in a sample of children and youth with autism spectrum disorders (ASD) who display challenging behaviour. Results indicated that there was low agreement between the functional hypotheses derived from each of the three measures. In addition, correlations between functionally analogous scales were substantially lower than expected, while correlations between non-analogous subscales were stronger than anticipated. As indicated by this study, clinicians choosing to use FBA questionnaires to assess behavioural function may not obtain accurate functional hypotheses, potentially resulting in ineffective intervention plans. The current study underscores the caution that must be taken when asking parents to complete these questionnaires to determine the function(s) of challenging behaviour for children and youth with ASD.

Relevance: 20.00%

Abstract:

The present research focused on the pathways through which the symptoms of posttraumatic stress disorder (PTSD) may negatively impact intimacy. Previous research has confirmed a link between self-reported PTSD symptoms and intimacy; however, a thorough examination of mediating paths, partner effects, and secondary traumatization has not yet been realized. With a sample of 297 heterosexual couples, intraindividual and dyadic models were developed to explain the relationships between PTSD symptoms and intimacy in the context of interdependence theory, attachment theory, and models of self-preservation (e.g., fight-or-flight). The current study replicated the findings of others and supported a process in which affective (alexithymia, negative affect, positive affect) and communication (demand-withdraw behaviour, self-concealment, and constructive communication) pathways mediate the intraindividual and dyadic relationships between PTSD symptoms and intimacy. Moreover, it also found that the PTSD symptoms of the two partners were significantly related; however, this was only the case for those dyads in which the partners had disclosed almost everything about their traumatic experiences. As such, secondary traumatization was supported. Finally, although the overall pattern of results suggests a total negative effect of PTSD symptoms on intimacy, a sex difference was evident such that the direct effect of the woman's PTSD symptoms was positively associated with both her own and her partner's intimacy. It is possible that the Tend-and-Befriend model of threat response, wherein women are said to foster social bonds in the face of distress, may account for this sex difference. Overall, however, it is clear that PTSD symptoms were negatively associated with relationship quality, and attention to this impact in the development of diagnostic criteria and treatment protocols is necessary.

Relevance: 20.00%

Abstract:

Hub Location Problems play vital economic roles in transportation and telecommunication networks where goods or people must be efficiently transferred from an origin to a destination point whilst direct origin-destination links are impractical. This work investigates the single allocation hub location problem, and proposes a genetic algorithm (GA) approach for it. The effectiveness of using a single-objective criterion measure for the problem is first explored. Next, a multi-objective GA employing various fitness evaluation strategies such as Pareto ranking, sum of ranks, and weighted sum strategies is presented. The effectiveness of the multi-objective GA is shown by comparison with an Integer Programming strategy, the only other multi-objective approach found in the literature for this problem. Lastly, two new crossover operators are proposed and an empirical study is done using small to large problem instances of the Civil Aeronautics Board (CAB) and Australian Post (AP) data sets.
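
Of the fitness-evaluation strategies mentioned, Pareto ranking is easy to state concretely: an individual's rank reflects how many other solutions dominate it, and the non-dominated front receives the best rank. The minimal sketch below illustrates that idea for the minimization of two hypothetical objectives (total cost and number of hubs); it is not the thesis's exact ranking scheme.

```python
def dominates(a, b):
    """a Pareto-dominates b (minimization): no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_ranks(objectives):
    """Rank 0 = non-dominated front; higher ranks are dominated by more solutions."""
    return [sum(dominates(other, obj) for other in objectives) for obj in objectives]

# Hypothetical (total cost, number of hubs) pairs for four GA individuals
pop = [(120.0, 3), (150.0, 2), (110.0, 4), (160.0, 4)]
print(pareto_ranks(pop))   # -> [0, 0, 0, 3]: the last individual is dominated by all others
```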