823 results for Boolean Computations
Abstract:
This dissertation derived hypotheses from the theories of Piaget, Bruner, and Dienes regarding the effects of using Algebra Tiles and other manipulative materials to teach remedial algebra to community college students. The dependent variables measured were achievement and attitude towards mathematics. The Piagetian cognitive level of the students was measured and used as a concomitant factor in the study.

The population for the study comprised remedial algebra students at a large urban community college. The sample consisted of 253 students enrolled in 10 sections of remedial algebra at three of the six campuses of the college. Pretests included an achievement pre-measure, Aiken's Mathematics Attitude Inventory (MAI), and the Group Assessment of Logical Thinking (GALT). Posttest measures included a course final exam and a second administration of the MAI.

The GALT results revealed that 161 students (63.6%) were concrete operational, 65 (25.7%) were transitional, and 27 (10.7%) were formal operational. For the purpose of analyzing the data, the transitional and formal operational students were grouped together.

Univariate factorial analyses of covariance (α = .05) were performed on the achievement posttest (covariate: achievement pretest) and the MAI posttest (covariate: MAI pretest). The factors in the analysis were method of teaching (manipulative vs. traditional) and cognitive level (concrete operational vs. transitional/formal operational).

The analyses for achievement revealed a significant difference in favor of the manipulatives groups in the computations by campus. Significant differences were not noted in the analysis by individual instructors.

The results for attitude towards mathematics showed a significant difference in favor of the manipulatives groups for the college-wide analysis and for one campus; the analysis by individual instructor was not significant. In addition, the college-wide analysis was significant in favor of the transitional/formal operational stage of cognitive development, although support for this conclusion was not obtained in the analyses by campus or individual instructor.
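For readers who want to see the shape of this design, the analysis reduces to a two-factor ANCOVA. Below is a minimal, hypothetical sketch in Python with statsmodels; the column names and synthetic data are invented for illustration and are not the study's data or exact model.

```python
# Hypothetical sketch of the 2x2 factorial ANCOVA described above
# (teaching method x cognitive level, achievement pretest as covariate).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 253  # sample size reported in the abstract
df = pd.DataFrame({
    "method": rng.choice(["manipulative", "traditional"], n),
    "cog_level": rng.choice(["concrete", "trans_formal"], n),
    "pretest": rng.normal(50, 10, n),
})
df["posttest"] = df["pretest"] + rng.normal(0, 5, n)  # placeholder outcome

# ANCOVA: posttest adjusted for pretest, with the two factors crossed
model = smf.ols("posttest ~ pretest + C(method) * C(cog_level)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-tests evaluated at alpha = .05
```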
Abstract:
If we classify the variables in a program into various security levels, then a secure information flow analysis aims to verify statically that information in the program can flow only in ways consistent with the specified security levels. One well-studied approach is to formulate the rules of the secure information flow analysis as a type system. A major trend of recent research focuses on how to accommodate various sophisticated modern language features; however, this approach often leads to overly complicated and restrictive type systems, making them unfit for practical use. Also, problems essential to practical use, such as type inference and error reporting, have received little attention. This dissertation identified and solved major theoretical and practical hurdles to the application of secure information flow.

We adopted a minimalist approach to designing our language to ensure a simple, lenient type system. We started with a small imperative language and added only the features we deemed most important for practical use. One language feature we addressed is arrays. Because of the various leaking channels associated with array operations, arrays have received complicated and restrictive typing rules in other secure languages. We presented a novel approach to lenient array operations, which leads to simple and lenient typing of arrays.

Type inference is necessary because a user is usually only concerned with the security types of the input/output variables of a program and would like all types of auxiliary variables to be inferred automatically. We presented a type inference algorithm B and proved its soundness and completeness. Moreover, algorithm B stays close to the program and the type system and therefore facilitates informative error reporting, generated in a cascading fashion. Algorithm B and the error reporting have been implemented and tested.

Lastly, we presented a novel framework for developing applications that ensure user information privacy. In this framework, core computations are defined as code modules that involve input/output data from multiple parties. Secure flow policies are refined incrementally based on feedback from the type checking/inference. Core computations interact with code modules from the involved parties only through well-defined interfaces, and all code modules are digitally signed to ensure their authenticity and integrity.
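To make the type-system idea concrete, here is a minimal sketch of the classical Denning/Volpano-Smith style check for explicit and implicit flows on a two-point lattice. It illustrates the general approach only; it is not the dissertation's language or its algorithm B.

```python
# Two-point security lattice LOW <= HIGH and the classical assignment rule:
# x := e is secure iff level(e) JOIN pc <= level(x), where pc tracks the
# security level of the control context (guards of enclosing branches).
from enum import IntEnum

class Lvl(IntEnum):
    LOW = 0
    HIGH = 1

def join(a: Lvl, b: Lvl) -> Lvl:
    return Lvl(max(a, b))

def check_assign(var_lvl: Lvl, expr_lvl: Lvl, pc: Lvl) -> bool:
    # Both the data being read and the control context must not
    # exceed the level of the variable being written.
    return join(expr_lvl, pc) <= var_lvl

env = {"secret": Lvl.HIGH, "public": Lvl.LOW}
# Explicit flow: public := secret  -> rejected
print(check_assign(env["public"], env["secret"], Lvl.LOW))   # False
# Implicit flow: inside `if secret: ...` pc is HIGH, so public := 0 is rejected
print(check_assign(env["public"], Lvl.LOW, Lvl.HIGH))        # False
```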
Abstract:
This research establishes new optimization methods for pattern recognition and classification of different white blood cells in actual patient data, to enhance the process of diagnosis. Beckman-Coulter Corporation supplied flow cytometry data from numerous patients, used as training sets to exploit the different physiological characteristics of the samples provided. Support Vector Machines (SVM) and Artificial Neural Networks (ANN) were used as promising pattern classification techniques to identify different white blood cell samples and to provide information to medical doctors, in the form of diagnostic references, for a specific disease state: leukemia. The results show that when a neural network classifier is well configured and trained with cross-validation, it can perform better than support vector classifiers alone on this type of data. Furthermore, a new unsupervised learning algorithm, the Density-based Adaptive Window Clustering (DAWC) algorithm, was designed to process large volumes of data and find the locations of dense data clusters in real time. It reduces the computational load to ~O(N) computations, making the algorithm more attractive and faster than current hierarchical algorithms.
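One plausible reading of how a density-based method reaches ~O(N) cost is a single binning pass over the events followed by a scan of the bins. The sketch below illustrates that idea only; the dissertation's actual DAWC algorithm, its window adaptation, and its parameters are not reproduced here.

```python
# Hypothetical single-pass density-peak finder: bin the events once (O(N)),
# then report the densest window. Illustration only, not the DAWC algorithm.
import numpy as np

def densest_window(x: np.ndarray, n_bins: int = 64) -> tuple[float, float]:
    counts, edges = np.histogram(x, bins=n_bins)   # one pass over the data
    k = int(np.argmax(counts))                     # densest bin
    return float(edges[k]), float(edges[k + 1])

rng = np.random.default_rng(1)
events = np.concatenate([rng.normal(200, 5, 5000),    # a cell population
                         rng.uniform(0, 1024, 2000)]) # scattered background
lo, hi = densest_window(events)
print(f"densest scatter window: [{lo:.1f}, {hi:.1f}]")
```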
Abstract:
Purpose. The goal of this study was to improve the favorable molecular interactions between starch and PPC by adding the grafting monomers MA and ROM as compatibilizers, which would improve the mechanical properties of starch/PPC composites.

Methodology. Calculations based on DFT and semi-empirical methods were performed on three systems: (a) starch/PPC, (b) starch/PPC-MA, and (c) starch-ROM/PPC. The theoretical computations involved determining the optimal geometries, binding energies, and vibrational frequencies of the blended polymers.

Findings. Calculations performed on five starch/PPC composites revealed hydrogen bond formation as the driving force behind stable composite formation, also confirmed by the negative relative energies of the composites, which indicate binding forces between the constituent co-polymers. The interaction between starch and PPC is further confirmed by the computed decrease in the stretching frequencies of the CO and OH groups participating in hydrogen bond formation, which agree qualitatively with the experimental values.

A three-step mechanism for grafting MA onto PPC was proposed to improve the compatibility of PPC with starch. Nine types of 'blends' produced by covalent bond formation between starch and MA-grafted PPC were found to be energetically stable, with blends involving MA grafted at the 'B' and 'C' positions of PPC showing binding-energy increases of 6.8 and 6.2 kcal/mol, respectively, compared with the non-grafted starch/PPC composites. A similar increase in binding energies was also observed for three types of 'composites' formed by hydrogen bond formation between starch and MA-grafted PPC.

Next, the grafting of ROM onto starch and subsequent blend formation with PPC was studied. All four types of blends formed by the reaction of ROM-grafted starch with PPC were found to be more energetically stable than the starch/PPC composite and the starch/PPC-MA composites and blends. A blend of PPC and ROM grafted at the 'a&d12;' position on amylose exhibited a maximal increase of 17.1 kcal/mol compared with the starch/PPC-MA blend.

Conclusions. ROM was found to be a more effective compatibilizer than MA in improving the favorable interactions between starch and PPC. The 'a&d12;' position was found to be the most favorable attachment point of ROM to amylose for stable blend formation with PPC.
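The stability criterion used throughout the abstract is the usual relative (binding) energy convention; in the notation of the systems above it reads as follows. This is a sketch of the standard definition, not a statement of the dissertation's exact level of theory:

```latex
% A composite is energetically stable when the complex lies below the
% energy of its separated constituents.
\[
  \Delta E_{\mathrm{bind}}
    = E_{\mathrm{starch/PPC}} - \bigl( E_{\mathrm{starch}} + E_{\mathrm{PPC}} \bigr),
  \qquad
  \Delta E_{\mathrm{bind}} < 0 \;\Rightarrow\; \text{stable composite.}
\]
```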
Abstract:
This study investigated the influence that receiving instruction in two languages, English and Spanish, had on the performance of students enrolled in the International Studies (IS) Program (a delayed partial immersion model) of Miami-Dade County Public Schools (MDCPS) on three sections of a standardized test in English, the Stanford Achievement Test, eighth edition: Reading Comprehension, Mathematics Computation, and Mathematics Applications.

The performance of the selected IS program/Spanish section cohort (N = 55) on the SAT Reading Comprehension, Mathematics Computation, and Mathematics Applications sections over four consecutive years was contrasted with that of a control group of comparable students selected within the same feeder pattern where the IS program is implemented (N = 21). The performance of the group was also compared with the cross-sectional achievement patterns of the school's corresponding feeder pattern, region, and district.

The research model was a variation of the "causal-comparative" or "ex post facto" design sometimes referred to as "prospective". After data were collected from MDCPS, t-tests were performed comparing the IS-Spanish students' SAT performance in grades 3 to 6, for the years 1994 to 1997, with control group, feeder pattern, region, and district norms for each year, for Reading Comprehension, Mathematics Computation, and Mathematics Applications. Repeated-measures ANOVA and Tukey's tests were calculated to compare the mean percentiles of the groups under study and the possible interactions of the different variables. All tests were performed at the 5% significance level.

The analyses showed that the IS group performed significantly better than the control group on all three measures across the four years. The IS group's mean percentiles on the three measures were also significantly higher than those of the feeder pattern, region, and district. The null hypotheses were rejected, and it was concluded that receiving instruction in two languages did not negatively affect the performance of IS program students on tests taken in English. It was also concluded that the particular design of the IS program enhances the general performance of participating students on standardized tests.

The quantitative analyses were coupled with interviews of teachers and administrators of the IS program to gain additional insight into different aspects of the implementation of the program at each particular school.
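The core group comparison is an independent-samples t-test at the 5% level. A minimal sketch in Python follows; the percentile values are synthetic stand-ins, with only the group sizes taken from the abstract.

```python
# Hypothetical sketch of one of the comparisons described above:
# IS cohort vs. control group on a SAT percentile measure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
is_group = rng.normal(65, 12, 55)    # N = 55 IS-program students (synthetic)
control  = rng.normal(55, 12, 21)    # N = 21 comparison students (synthetic)

# Welch's t-test avoids assuming equal variances for unequal group sizes
t, p = stats.ttest_ind(is_group, control, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}, reject H0 at 5%: {p < 0.05}")
```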
Abstract:
This work proposes using a behavioral model of the hysteresis loop of the ferroelectric capacitor as a new alternative to the usually costly techniques for computing nonlinear functions in artificial neurons implemented on a reconfigurable hardware platform, in this case an FPGA device. The proposal was first validated by implementing Boolean logic through digital models of two artificial neurons, the Perceptron and a variation of the Integrate-and-Fire Spiking Neuron, both using a digital model of the ferroelectric capacitor's hysteresis loop as the basic nonlinear unit for calculating the neurons' outputs. Finally, the analog model of the ferroelectric capacitor was used with the goal of verifying its effectiveness and possibly reducing the number of logic elements needed when implementing the artificial neurons on an integrated circuit. The implementations were carried out as Simulink models, and synthesis was performed with the DSP Builder software from Altera Corporation.
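A much-simplified behavioral sketch of the idea: a perceptron whose output nonlinearity is a hysteresis loop, i.e. two thresholds plus a remembered state standing in for the capacitor's polarization. The weights and thresholds below are invented for illustration and this is not the work's ferroelectric model.

```python
# Perceptron with a hysteresis (Schmitt-trigger-like) output nonlinearity.
import numpy as np

class HysteresisNeuron:
    def __init__(self, w, t_up=0.5, t_down=-0.5):
        self.w = np.asarray(w, float)
        self.t_up, self.t_down = t_up, t_down
        self.state = 0  # remembered output (the "polarization")

    def __call__(self, x) -> int:
        a = float(self.w @ np.asarray(x, float))  # weighted sum of inputs
        if a >= self.t_up:
            self.state = 1      # switch up only past the upper threshold
        elif a <= self.t_down:
            self.state = 0      # switch down only past the lower threshold
        return self.state       # between thresholds: keep previous state

# AND-like behavior with weights (1, 1); note the memory effect:
# after (1, 1) drives the state high, (1, 0) falls between the
# thresholds and the output holds at 1 instead of resetting.
neuron = HysteresisNeuron(w=[1.0, 1.0], t_up=1.5, t_down=0.5)
for x in [(0, 0), (0, 1), (1, 1), (1, 0)]:
    print(x, neuron(x))
```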
Abstract:
This work develops socio-environmental indicators of disaster risk in precarious areas of human occupation subject to intense environmental dynamics, from the perspective of geographic studies of the subject. The District of Mãe Luiza in Natal, capital of Rio Grande do Norte, was defined as the study area; the place was chosen because it has historically presented several conditions of vulnerability and exposure to disaster risk. After a socio-environmental characterization of the area, two indexes were elaborated: the Social Vulnerability Index (SVI, or IVS in Portuguese), based on 17 (seventeen) variables collected through a questionnaire administered to population nuclei of the district over a regular grid (systematic sampling) and classified into 5 (five) levels of social vulnerability from the weighted average; and the Physical and Natural Exposure to Mass Movements Index (EMMI, or IEMM in Portuguese), based on 16 (sixteen) variables characterizing conditions of exposure to mass movements in the district, with levels from 1 (one) to 5 (five) likewise classified from the weighted average. Combining these two results, spatialized on the district map, produced the Socio-Environmental Vulnerability Index (SEVI, or IVSA in Portuguese) of Mãe Luiza, also classified into 5 (five) levels following Boolean-logic correlation for cartographic overlay with the ArcGIS v.9.3 software: Very Low, Low, Average, High, and Very High socio-environmental vulnerability in the district. The study is based on the methodologies proposed by Guerra et al. (2009) for the EMMI and by Almeida (2010) for the SVI, modified and adapted to the local reality, thereby producing a new methodology for this study area. It was concluded that most of the district's area has High or Very High socio-environmental vulnerability to disasters; 7 (seven) critical areas were defined, with Very High SEVI and hazards associated with mass movements or flooding. Finally, the main problems found were enumerated as inputs for proposing mitigation measures and/or interventions. Structural vulnerability factors include the poor constructive standard of households, deficient urban drainage, abandoned properties in landslide paths, and inadequate infrastructure of access roads and slope containment; social factors include a lack of education about environmental risk, the low income and schooling of residents, and the presence of persons with reduced mobility and/or special needs. This reality highlights the need for urgent actions aimed at resolving and/or reducing these problems, which is the focus of the final part of this work.
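The index construction described above, weighted averages classified into five levels and then overlaid cell by cell with a Boolean rule, can be sketched as follows. The weights, values, and the specific combination rule are invented for illustration; the study's actual variables and thresholds are not reproduced.

```python
# Hypothetical sketch: weighted-average index per grid cell, quintile
# classification into levels 1..5, and a Boolean overlay of two indices.
import numpy as np

def weighted_index(values: np.ndarray, weights: np.ndarray) -> np.ndarray:
    score = values @ weights / weights.sum()         # weighted average per cell
    bins = np.quantile(score, [0.2, 0.4, 0.6, 0.8])  # five classes
    return np.digitize(score, bins) + 1              # levels 1..5

rng = np.random.default_rng(3)
svi  = weighted_index(rng.random((100, 17)), rng.random(17))  # 17 SVI variables
emmi = weighted_index(rng.random((100, 16)), rng.random(16))  # 16 EMMI variables

# One simple Boolean overlay rule: a cell is critical when both indices
# are at their highest (Very High) level.
sevi_critical = (svi == 5) & (emmi == 5)
print(int(sevi_critical.sum()), "critical cells")
```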
Abstract:
The northern Antarctic Peninsula is one of the fastest changing regions on Earth. The disintegration of the Larsen-A Ice Shelf in 1995 caused tributary glaciers to adjust by speeding up, lowering their surfaces, and increasing overall ice-mass discharge. In this study, we investigate the temporal variation of these changes at the Dinsmoor-Bombardier-Edgeworth glacier system by analyzing dense time series from various spaceborne and airborne Earth observation missions, covering pre-collapse ice shelf conditions and the subsequent adjustments through 2014. Our results show a response of the glacier system some months after the breakup, reaching maximum surface velocities at the glacier front of up to 8.8 m/d in 1999 and a subsequent decrease to ~1.5 m/d in 2014. Using a dense time series of interferometrically derived TanDEM-X digital elevation models and photogrammetric data, an exponential function was fitted to the decrease in surface elevation. Elevation changes in areas below 1000 m a.s.l. amounted to at least 130±15 m between 1995 and 2014, with change rates of ~3.15 m/a between 2003 and 2008. Current change rates (2010-2014) are in the range of 1.7 m/a. Mass imbalances were computed with different scenarios of boundary conditions. The most plausible results amount to -40.7±3.9 Gt. The contribution to sea level rise was estimated to be 18.8±1.8 Gt, corresponding to a 0.052±0.005 mm sea level equivalent, for the period 1995-2014. Our analysis and scenario considerations revealed that major uncertainties still exist due to insufficiently accurate ice-thickness information. The second largest uncertainty in the computations was the glacier surface mass balance, which is still poorly known. Our time series analysis facilitates an improved comparison with GRACE data and serves as input to modeling of glacio-isostatic uplift in this region. The study contributes to a better understanding of how glacier systems adjust to ice shelf disintegration.
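The exponential fit of the post-collapse surface lowering can be sketched with a standard least-squares routine. The model form and the data points below are illustrative only, loosely shaped to match the magnitudes quoted in the abstract; they are not the study's TanDEM-X/photogrammetric series.

```python
# Minimal sketch: fit h(t) = a * exp(-t / tau) + c to cumulative
# surface-elevation change since the 1995 collapse.
import numpy as np
from scipy.optimize import curve_fit

def model(t, a, tau, c):
    return a * np.exp(-t / tau) + c

t  = np.array([0, 3, 8, 13, 16, 19], float)             # years since 1995
dh = np.array([0, -55, -95, -118, -126, -130], float)   # synthetic lowering, m

(a, tau, c), _ = curve_fit(model, t, dh, p0=(130.0, 6.0, -130.0))
print(f"fitted e-folding time: {tau:.1f} a")
print(f"change over 2010-2014: {model(19, a, tau, c) - model(15, a, tau, c):.1f} m")
```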
Abstract:
R.N.P. and P.J.H. are grateful for funding from an NSERC Discovery Grant. Computations were performed on the GPC supercomputer at the SciNet HPC Consortium. SciNet is funded by the Canada Foundation for Innovation under the auspices of Compute Canada, the Government of Ontario, Ontario Research Fund—Research Excellence and the University of Toronto. Numerical calculations were done using a modified version of the SOPALE (2000) software. The SOPALE modelling code was originally developed by Philippe Fullsack at Dalhousie University with Chris Beaumont and his Geodynamics group.
Abstract:
A class of lifetime distributions that has received considerable attention in the modelling and analysis of lifetime data is the class of lifetime distributions with bath-tub shaped failure rate functions, because of their extensive applications. The purpose of this thesis was to introduce a new class of bivariate lifetime distributions with bath-tub shaped failure rates (BTFRFs). In this research, we first reviewed univariate lifetime distributions with bath-tub shaped failure rates and several multivariate extensions of a univariate failure rate function. We then introduced a new class of bivariate distributions with bath-tub shaped failure rates (hazard gradients). Specifically, the new class of bivariate lifetime distributions was developed using Morgenstern's method of defining a bivariate class of distributions with given marginals. Computer simulations and numerical computations were used to investigate the properties of these distributions.
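As background, these are the two standard ingredients the abstract refers to, stated in their usual textbook form (the thesis's new class itself is not reproduced here): Morgenstern's construction of a bivariate family from given marginals F and G, and the hazard gradient as the bivariate analogue of the failure rate.

```latex
% Morgenstern (Farlie-Gumbel-Morgenstern) family with marginals F and G:
\[
  H_\theta(x, y) = F(x)\,G(y)\bigl[\,1 + \theta\,\bigl(1 - F(x)\bigr)\bigl(1 - G(y)\bigr)\bigr],
  \qquad \lvert\theta\rvert \le 1 .
\]
% Hazard gradient of the joint survival function \bar H:
\[
  \eta_i(x_1, x_2) = -\frac{\partial}{\partial x_i}\,\log \bar H(x_1, x_2),
  \qquad i = 1, 2 .
\]
% A bivariate bath-tub shape then means each component \eta_i first
% decreases and later increases in x_i.
```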
Abstract:
When we study the variables that affect survival time, we usually estimate their effects with the Cox regression model. In biomedical research, the effects of the covariates are often modified by a biomarker variable, which leads to covariate-biomarker interactions; here the biomarker is an objective measurement of a patient characteristic at baseline. Liu et al. (2015) built a local partial likelihood bootstrap model to estimate and test this interaction effect of covariates and biomarker, but the R code developed by Liu et al. (2015) can only handle one variable and one interaction term and cannot fit the model with adjustment for nuisance variables. In this project, we expand the model to allow adjustment for nuisance variables, extend the R code to take any chosen interaction terms, and provide many parameters for users to customize their analyses. We also build an R package called "lplb" that integrates the complex computations into a simple interface. We conduct numerical simulations showing that the new method has excellent finite-sample properties under both the null and the alternative hypothesis. We also applied the method to analyze data from a prostate cancer clinical trial with the acid phosphatase (AP) biomarker.
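A simple global-interaction Cox fit is the parametric analogue of what the local partial likelihood approach estimates nonparametrically. The sketch below uses the lifelines library with synthetic data; it is not the "lplb" package and does not implement the local partial likelihood bootstrap itself.

```python
# Cox model with a covariate-biomarker product term as a global
# (parametric) stand-in for the local-interaction analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 400
trt = rng.integers(0, 2, n)                  # covariate of interest
bio = rng.normal(0, 1, n)                    # baseline biomarker (e.g. AP)
hazard = np.exp(0.3 * trt + 0.2 * bio + 0.5 * trt * bio)
T = rng.exponential(1.0 / hazard)            # synthetic event times
df = pd.DataFrame({"trt": trt, "bio": bio, "trt_x_bio": trt * bio,
                   "T": T, "E": np.ones(n, int)})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
print(cph.summary[["coef", "p"]])            # trt_x_bio row tests the interaction
```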
Abstract:
Ageing of the population is a worldwide phenomenon. Numerous ICT-based solutions have been developed for elderly care, but mainly connected to the physiological and nursing aspects of services for the elderly. Social work is a profession that should pay attention to the comprehensive wellbeing and social needs of the elderly. Many people experience loneliness and depression in their old age, either as a result of living alone or due to a lack of close family ties and reduced connections with their culture of origin, which results in an inability to participate actively in community activities (Singh & Misra, 2009). Participation in society would enhance their quality of life. With the development of information technology, the use of technology in social work practice has risen dramatically. The aim of this literature review is to map out the state of the art of knowledge about the use of ICT in elderly care and to identify research-based knowledge about the usability of ICT for preventing loneliness and social isolation among elderly people. The data for the current research come from the core collection of the Web of Science, searched with Boolean operators. The search returned 216 published English-language articles. After screening the topics and abstracts, 34 articles were selected for a data analysis based on a multi-approach framework. The analysis of the research approaches is categorized according to aspects of ICT use by older adults, from the adoption of ICT to the impact of its usage and the social services built on it. This literature review focused on the communication function, excluding applications that mainly relate to physical nursing. The results show that the so-called 'digital divide' still exists, but older adults are willing to learn and use ICT in daily life, especially for communication. The data show that the use of ICT can prevent loneliness and social isolation among older adults, and that they are eager for technical support in using ICT. The analysis of theoretical frames and concepts shows that this research field applies different theoretical frames from various scientific fields, while a social work approach is lacking; a synergic frame of applied theories is therefore suggested from the perspective of social work.
Abstract:
The large upfront investments required for game development pose a severe barrier to the wider uptake of serious games in education and training. There is also a lack of well-established methods and tools that support game developers in preserving and enhancing a game's pedagogical effectiveness. The RAGE project, a Horizon 2020 funded research project on serious games, addresses these issues by making available reusable software components designed to support the pedagogical qualities of serious games. To allow these game components to be easily deployed and integrated in a multitude of game engines, platforms, and programming languages, RAGE has developed and validated a hybrid component-based software architecture that preserves component portability and interoperability. While a first set of software components is being developed, this paper presents selected examples to explain the overall concept and its practical benefits. First, the Emotion Detection component uses the learners' webcams to capture their emotional states from facial expressions. Second, the Performance Statistics component is an add-on for learning analytics data processing, which allows instructors to track and inspect learners' progress without having to deal with the required statistical computations. Third, a set of language processing components supports the analysis of learners' textual inputs, facilitating comprehension assessment and prediction. Fourth, the Shared Data Storage component provides a technical solution for storing data - e.g. player data or game world data - across multiple software components. The presented components are exemplary of the anticipated RAGE library, which will include up to forty reusable software components for serious gaming, addressing diverse pedagogical dimensions.
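The portability idea behind such a component-based architecture is that each component exposes one narrow, engine-agnostic interface, so that only a thin adapter is engine-specific. The sketch below illustrates that pattern in Python; the class and method names are invented and are not RAGE's actual API.

```python
# Hypothetical component contract plus a statistics component and a thin
# engine adapter, illustrating the portability pattern described above.
from abc import ABC, abstractmethod

class GameComponent(ABC):
    """Engine-agnostic contract every reusable component implements."""
    @abstractmethod
    def initialize(self, bridge: "EngineBridge") -> None: ...
    @abstractmethod
    def process(self, event: dict) -> dict: ...

class PerformanceStatistics(GameComponent):
    def initialize(self, bridge):
        self.bridge, self.scores = bridge, []

    def process(self, event):
        if event.get("type") == "score":
            self.scores.append(event["value"])
        n = len(self.scores)
        return {"mean": sum(self.scores) / n if n else 0.0, "count": n}

class EngineBridge:
    """Thin per-engine adapter: the only engine-specific code needed."""
    def log(self, msg: str) -> None:
        print(msg)

comp = PerformanceStatistics()
comp.initialize(EngineBridge())
print(comp.process({"type": "score", "value": 0.8}))
```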
Abstract:
Context. The 30 Doradus (30 Dor) region of the Large Magellanic Cloud, also known as the Tarantula nebula, is the nearest starburst region. It contains the richest population of massive stars in the Local Group, and it is thus the best possible laboratory to investigate open questions on the formation and evolution of massive stars. Aims. Using ground-based multi-object optical spectroscopy obtained in the framework of the VLT-FLAMES Tarantula Survey (VFTS), we aim to establish the (projected) rotational velocity distribution for a sample of 216 presumably single O-type stars in 30 Dor. The sample is large enough to obtain statistically significant information and to search for variations among subpopulations - in terms of spectral type, luminosity class, and spatial location - in the field of view. Methods. We measured projected rotational velocities, v_e sin i, by means of a Fourier transform method and a profile fitting method applied to a set of isolated spectral lines. We also used an iterative deconvolution procedure to infer the probability density, P(v_e), of the equatorial rotational velocity, v_e. Results. The distribution of v_e sin i shows a two-component structure: a peak around 80 km s^-1 and a high-velocity tail extending up to 600 km s^-1. This structure is also present in the inferred distribution P(v_e), with around 80% of the sample having 0 < v_e ≤ 300 km s^-1 and the other 20% distributed in the high-velocity region. The presence of the low-velocity peak is consistent with what has been found in other studies for late O- and early B-type stars. Conclusions. Most of the stars in our sample rotate at a rate less than 20% of their break-up velocity. For the bulk of the sample, mass loss in a stellar wind and/or envelope expansion is not efficient enough to significantly spin down these stars within the first few Myr of evolution. If massive-star formation results in stars rotating at birth with a large portion of their break-up velocities, an alternative braking mechanism, possibly magnetic fields, is thus required to explain the present-day rotational properties of the O-type stars in 30 Dor. The presence of a sizeable population of fast rotators is compatible with recent population synthesis computations that investigate the influence of binary evolution on the rotation rate of massive stars. Even though we have excluded stars that show significant radial velocity variations, our sample may have remained contaminated by post-interaction binary products. That the high-velocity tail may be populated primarily (and perhaps exclusively) by post-binary interaction products has important implications for the evolutionary origin of systems that produce gamma-ray bursts. © 2013 Author(s).
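The deconvolution step rests on a standard result for randomly oriented rotation axes, sketched below in its textbook form (the survey's specific iterative procedure is not reproduced): the density of s = sin i is s/√(1-s²), so the observed distribution of y = v_e sin i is an Abel-type transform of P(v_e).

```latex
% Relation between the observed distribution f of y = v_e sin i and the
% equatorial-velocity density P(v_e), assuming isotropic rotation axes:
\[
  f(y) \;=\; \int_{y}^{\infty} P(v_e)\,
       \frac{y}{v_e\,\sqrt{v_e^{2} - y^{2}}}\; \mathrm{d}v_e ,
\]
% which the iterative deconvolution inverts to recover P(v_e).
```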