873 results for pacs: engineering mathematics and mathematical techniques
Abstract:
This investigation sought to explore the nature and extent of school mathematical difficulties among the dyslexic population. Anecdotal reports have suggested that many dyslexics may have difficulties in arithmetic, but few systematic studies have previously been undertaken. The literature pertaining to dyslexia and school mathematics respectively is reviewed. Clues are sought in studies of dyscalculia, but these seem inadequate in accounting for dyslexics' reported mathematical difficulties. Similarities between aspects of language and mathematics are examined for underlying commonalities that may partially account for concomitant problems in mathematics in individuals with a written language dysfunction. The performance of children taught using different mathematics work-schemes is assessed to ascertain whether these are associated with differential levels of achievement that may be reflected in the dyslexic population; few differences are found. Findings from studies designed to assess the relationship between written language failure and achievement in mathematics are reported. Study 1 reveals large correlational differences between subtest scores (Wechsler Intelligence Scale for Children, Wechsler, 1976) and three mathematics tests for young dyslexics and children without literacy difficulties. However, few differences are found between levels of attainment at this age (6½ - 9 years). Further studies indicate that, for dyslexics, achievement in school mathematics may be independent of measured intelligence, as is the case with their literacy skills. Studies 3 and 4 reveal that dyslexics' performance on a range of school mathematical topics gets relatively worse, compared with that of controls (age range 8 - 17 years), as they get older. Extensive item analyses reveal many errors relating strongly to known deficits in the dyslexics' learning style: poor short-term memory, sequencing skills and verbal labelling strategies.
Subgroups of dyslexics are identified on the basis of mathematical performance. Tentative explanations, involving alternative neuropsychological approaches, are offered for the measured differences in attainment between these groups.
Abstract:
List of Participants
Abstract:
In October 1997 we celebrate the fiftieth anniversary of the founding of the Institute for Mathematics and Informatics (IMI) of the Bulgarian Academy of Sciences (BAS).
Abstract:
Report published in the Proceedings of the National Conference on "Education in the Information Society", Plovdiv, May, 2013
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May, 2014
Abstract:
Math storybooks are picture books in which the understanding of mathematical concepts is central to the comprehension of the story. Math stories have provided useful opportunities for children to expand their skills in the language arts area and to talk about mathematical factors that are related to their real lives. The purpose of this study was to examine bilingual children's reading and math comprehension of math storybooks.

The participants were randomly selected from two Korean schools and two public elementary schools in Miami, Florida. The sample consisted of 63 Hispanic American and 43 Korean American children from ages five to seven. A 2 x 3 x (2) mixed-model design with two between-subjects variables and one within-subjects variable was used to conduct this study. The two between-subjects variables were ethnicity and age, and the within-subjects variable was the subject area of comprehension. Subjects were read the three math stories individually and were then asked questions related to reading and math comprehension.

An overall ANOVA using multivariate tests was conducted to evaluate the factor of subject area for age and ethnicity. As follow-up tests for a significant main effect and a significant interaction effect, pairwise comparisons and simple main effect tests were conducted, respectively.

The results showed significant ethnicity and age differences in total comprehension scores. There were also age differences in reading and math comprehension, but no significant differences were found in reading and math by ethnicity. Korean American children had higher total comprehension scores than Hispanic American children, and they showed greater changes in their comprehension skills at the younger ages, from five to six, whereas Hispanic American children showed greater changes at the older ages, from six to seven. Children at ages five and six showed higher scores in reading than in math, but no significant differences between math and reading comprehension scores were found at age seven.

Through schooling with integrated instruction, young bilingual children can move into higher levels of abstraction and concepts. This study highlighted bilingual children's general nature of thinking and showed how they developed reading and mathematics comprehension in an integrated process.
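The 2 x 3 x (2) mixed design described above can be made concrete with a small data-layout sketch: scores are grouped into cells by the two between-subjects factors (ethnicity, age) and the within-subjects factor (subject area). The records and values below are illustrative only, not the study's data.

```python
# Sketch of the cell structure of a 2 (ethnicity) x 3 (age) x (2: subject
# area, within-subjects) mixed design. Illustrative data, not the study's.
from collections import defaultdict

def cell_means(records):
    """records: iterable of (ethnicity, age, area, score) tuples.
    Returns the mean score for each (ethnicity, age, area) cell."""
    sums, counts = defaultdict(float), defaultdict(int)
    for eth, age, area, score in records:
        sums[(eth, age, area)] += score
        counts[(eth, age, area)] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Hypothetical comprehension scores for one cell of the design:
data = [("Korean American", 5, "reading", 8.0),
        ("Korean American", 5, "reading", 6.0),
        ("Korean American", 5, "math", 5.0)]
means = cell_means(data)
print(means[("Korean American", 5, "reading")])  # 7.0
```

The within-subjects contrast (reading minus math) is then computed per child from paired cells; a full mixed-model ANOVA would be run on these cells with a statistics package.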
Abstract:
Petri nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems, and have been widely applied in computer science and many other engineering disciplines. Low level Petri nets are simple and useful for modeling control flows but not powerful enough to define data and system functionality. High level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low level Petri nets, HLPNs result in compact system models that are easier to understand; HLPNs are therefore more useful in modeling complex systems.

There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool.

For modeling, this framework integrates two formal languages: a type of HLPN called Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is a software tool that supports the formal modeling capabilities in this framework.

For analysis, this framework combines three complementary techniques: simulation, explicit state model checking and bounded model checking (BMC). Simulation is a straightforward and speedy method, but it only covers some execution paths in an HLPN model. Explicit state model checking covers all the execution paths but suffers from the state explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the formal analysis capabilities in this framework.

The SAMTools developed for this framework integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
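The token-game semantics underlying Petri net simulation (the low level case, before HLPN extensions such as structured tokens and transition formulas) can be sketched minimally as follows; the class and method names are illustrative and are not the PIPE+ or SAMTools API.

```python
# Minimal sketch of low level Petri net firing semantics: a transition is
# enabled when every input place holds enough tokens; firing consumes
# input tokens and produces output tokens. Illustrative only.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        # inputs/outputs: dicts mapping place name -> arc weight
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n                           # consume tokens
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n   # produce tokens

# Producer/consumer toy net: 'produce' moves a token from 'idle' to 'buffer'.
net = PetriNet({"idle": 1, "buffer": 0})
net.add_transition("produce", {"idle": 1}, {"buffer": 1})
net.add_transition("consume", {"buffer": 1}, {"idle": 1})
net.fire("produce")
print(net.marking)  # {'idle': 0, 'buffer': 1}
```

Simulation repeatedly fires enabled transitions to explore one execution path; explicit state model checking would instead enumerate all reachable markings from such a net.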
Abstract:
Cancer comprises a collection of diseases, all of which begin with abnormal tissue growth from various stimuli, including (but not limited to): heredity, genetic mutation, exposure to harmful substances, radiation, poor diet and lack of exercise. The early detection of cancer is vital to providing life-saving, therapeutic intervention. However, current methods for detection (e.g., tissue biopsy, endoscopy and medical imaging) often suffer from low patient compliance and an elevated risk of complications in elderly patients. As such, many are looking to "liquid biopsies" for clues into the presence and status of cancer, owing to their minimal invasiveness and ability to provide rich information about the native tumor. In such liquid biopsies, peripheral blood is drawn from patients and screened for key biomarkers, chiefly circulating tumor cells (CTCs). Capturing, enumerating and analyzing the genetic and metabolomic characteristics of these CTCs may hold the key to guiding doctors to better understand the source of cancer at an earlier stage for more efficacious disease management.
The isolation of CTCs from whole blood, however, remains a significant challenge due to their (i) low abundance, (ii) lack of a universal surface marker and (iii) epithelial-mesenchymal transition, which down-regulates common surface markers (e.g., EpCAM) and reduces their likelihood of detection via positive selection assays. These factors underscore the need for an improved cell isolation strategy that can collect CTCs via both positive and negative selection modalities, so as to avoid reliance on a single marker, or set of markers, for more accurate enumeration and diagnosis.
The technologies proposed herein offer a unique set of strategies to focus, sort and template cells in three independent microfluidic modules. The first module exploits ultrasonic standing waves and a class of elastomeric particles for the rapid and discriminate sequestration of cells. This type of cell handling holds promise not only in sorting, but also in the isolation of soluble markers from biofluids. The second module contains components to focus (i.e., arrange) cells via forces from acoustic standing waves and separate cells in a high throughput fashion via free-flow magnetophoresis. The third module uses a printed array of micromagnets to capture magnetically labeled cells into well-defined compartments, enabling on-chip staining and single cell analysis. These technologies can operate in standalone formats, or can be adapted to operate with established analytical technologies, such as flow cytometry. A key advantage of these innovations is their ability to process erythrocyte-lysed blood in a rapid (and thus high throughput) fashion. They can process fluids at a variety of concentrations and flow rates, target cells with various immunophenotypes and sort cells via positive (and potentially negative) selection. These technologies are chip-based and fabricated using standard clean room equipment, pointing toward a disposable clinical tool. With further optimization in design and performance, these technologies might aid in the early detection, and potentially treatment, of cancer and various other physical ailments.
Abstract:
Measurement and modeling techniques were developed to improve over-water gaseous air-water exchange measurements for persistent bioaccumulative and toxic chemicals (PBTs). Analytical methods were applied to atmospheric measurements of hexachlorobenzene (HCB), polychlorinated biphenyls (PCBs), and polybrominated diphenyl ethers (PBDEs). Additionally, the sampling and analytical methods are well suited to studying semivolatile organic compounds (SOCs) in air, with applications related to secondary organic aerosol formation and urban and indoor air quality. A novel gas-phase cleanup method is described for use with thermal desorption methods for analysis of atmospheric SOCs using multicapillary denuders. The cleanup selectively removed hydrogen-bonding chemicals from samples, including much of the background matrix of oxidized organic compounds in ambient air, and thereby improved precision and method detection limits for nonpolar analytes. A model is presented that predicts gas collection efficiency and particle collection artifact for SOCs in multicapillary denuders using polydimethylsiloxane (PDMS) sorbent. An approach is presented to estimate the equilibrium PDMS-gas partition coefficient (Kpdms) from an Abraham solvation parameter model for any SOC. A high flow rate (300 L min-1) multicapillary denuder was designed for measurement of trace atmospheric SOCs. Overall method precision and detection limits were determined using field duplicates and compared to the conventional high-volume sampler method. The high-flow denuder is an alternative to high-volume or passive samplers when separation of gas and particle-associated SOCs upstream of a filter and short sample collection time are advantageous. A Lagrangian internal boundary layer transport exchange (IBLTE) Model is described.
The model predicts the near-surface variation in several quantities with fetch in coastal, offshore flow: 1) modification in potential temperature and gas mixing ratio, 2) surface fluxes of sensible heat, water vapor, and trace gases using the NOAA COARE Bulk Algorithm and Gas Transfer Model, 3) vertical gradients in potential temperature and mixing ratio. The model was applied to interpret micrometeorological measurements of air-water exchange flux of HCB and several PCB congeners in Lake Superior. The IBLTE Model can be applied to any scalar, including water vapor, carbon dioxide, dimethyl sulfide, and other scalar quantities of interest with respect to hydrology, climate, and ecosystem science.
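An Abraham solvation parameter model of the kind used above to estimate Kpdms is a linear free energy relationship in the solute's descriptors. The sketch below shows its general form; the system constants are hypothetical placeholders, not the fitted PDMS constants from this work, which must come from calibration against measured partitioning data.

```python
# Sketch of an Abraham-type linear solvation energy relationship (LSER)
# for a gas-to-PDMS partition coefficient. System constants here are
# PLACEHOLDERS for illustration, not fitted values from this work.

def log_k_pdms(E, S, A, B, L, system=(0.0, 0.0, 0.0, 0.0, 0.0, 1.0)):
    """log K = c + e*E + s*S + a*A + b*B + l*L.

    E, S, A, B, L: Abraham solute descriptors (excess molar refraction,
    dipolarity/polarizability, H-bond acidity, H-bond basicity, and the
    log hexadecane-air partition coefficient, respectively).
    system: the fitted PDMS system constants (c, e, s, a, b, l).
    """
    c, e, s, a, b, l = system
    return c + e * E + s * S + a * A + b * B + l * L

# With the placeholder constants, log K reduces to l*L:
print(log_k_pdms(E=0.0, S=0.0, A=0.0, B=0.0, L=4.5))  # 4.5
```

Because the descriptors are tabulated for a very wide range of compounds, such a relationship lets the denuder model above be applied to essentially any SOC once the six system constants are fitted.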
Abstract:
The etiological agent of maize white spot (MWS) disease has been a subject of controversy and discussion. Initially the disease was described as Phaeosphaeria leaf spot caused by Phaeosphaeria maydis. Other authors have suggested the existence of different fungal species causing similar symptoms. Recently, a bacterium, Pantoea ananatis, was described as the causal agent of this disease. The purpose of this study was to offer additional information on the correct etiology of this disease by providing visual evidence of the presence of the bacterium in the interior of the MWS lesions using transmission electron microscopy (TEM) and molecular techniques. TEM allowed visualization of a large amount of bacteria in the intercellular spaces of lesions collected from both artificially and naturally infected plants. Fungal structures were not visualized in young lesions. Bacterial primers for the 16S rRNA and rpoB genes were used in PCR reactions to amplify DNA extracted from water-soaked (young) and necrotic lesions. The universal fungal oligonucleotide ITS4 was also included to identify the possible presence of fungal structures inside lesions. Positive PCR products from water-soaked lesions, both from naturally and artificially inoculated plants, were produced with bacterial primers, whereas no amplification was observed when the ITS4 oligonucleotide was used. On the other hand, DNA amplification with the ITS4 primer was observed when DNA was isolated from necrotic (old) lesions. These results reinforced the previous report of P. ananatis as the primary pathogen and the hypothesis that fungal species may colonize lesions pre-established by P. ananatis.
Abstract:
Experimental mechanical sieving methods are applied to samples of shellfish remains from three sites in southeast Queensland, Seven Mile Creek Mound, Sandstone Point and One-Tree, to test the efficacy of various recovery and quantification procedures commonly applied to shellfish assemblages in Australia. There has been considerable debate regarding the most appropriate sieve sizes and quantification methods that should be applied in the recovery of vertebrate faunal remains. Few studies, however, have addressed the impact of recovery and quantification methods on the interpretation of invertebrates, specifically shellfish remains. In this study, five shellfish taxa representing four bivalves (Anadara trapezia, Trichomya hirsutus, Saccostrea glomerata, Donax deltoides) and one gastropod (Pyrazus ebeninus) common in eastern Australian midden assemblages are sieved through 10 mm, 6.3 mm and 3.15 mm mesh. Results are quantified using MNI, NISP and weight. Analyses indicate that different structural properties and pre- and postdepositional factors affect recovery rates. Fragile taxa (T. hirsutus) or those with foliated structure (S. glomerata) tend to be overrepresented by NISP measures in smaller sieve fractions, while more robust taxa (A. trapezia and P. ebeninus) tend to be overrepresented by weight measures. Results demonstrate that, for all quantification methods tested, a 3 mm sieve should be used on all sites to allow for regional comparability and to effectively collect all available information about the shellfish remains.
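Two of the quantification measures compared above, NISP (number of identified specimens) and weight, can be sketched as a simple aggregation per taxon and sieve fraction; the field names and values below are illustrative, not data from the three sites.

```python
# Sketch of tallying NISP and weight per (taxon, sieve fraction).
# Illustrative records only; MNI would additionally require counting
# non-repetitive elements (e.g., hinges or spires) per taxon.
from collections import defaultdict

def quantify(fragments):
    """fragments: iterable of (taxon, sieve_mm, weight_g) tuples.
    Returns NISP counts and summed weights keyed by (taxon, sieve)."""
    nisp = defaultdict(int)
    weight = defaultdict(float)
    for taxon, sieve, w in fragments:
        nisp[(taxon, sieve)] += 1     # each identified fragment counts once
        weight[(taxon, sieve)] += w   # robust taxa dominate weight measures
    return dict(nisp), dict(weight)

frags = [("S. glomerata", 3.15, 0.2),   # foliated: many small fragments
         ("S. glomerata", 3.15, 0.1),
         ("A. trapezia", 10.0, 5.4)]    # robust: few heavy fragments
nisp, wt = quantify(frags)
```

Comparing the two tallies per sieve fraction makes the reported biases visible: fragment-prone taxa inflate NISP in the fine fractions, while dense-shelled taxa inflate weight.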
Impact of Commercial Search Engines and International Databases on Engineering Teaching and Research
Abstract:
For the last three decades, the engineering higher education and professional environments have been completely transformed by the "electronic/digital information revolution" that has included the introduction of the personal computer, the development of email and the World Wide Web, and broadband Internet connections at home. Herein the writer compares the performance of several digital tools with traditional library resources. While new specialised search engines and open access digital repositories may fill a gap between conventional search engines and traditional references, these should not be confused with real libraries and international scientific databases that encompass textbooks and peer-reviewed scholarly works. Absence from some Internet search listings, databases and repositories is not an indication of standing. Researchers, engineers and academics should remember these key differences when assessing the quality of bibliographic "research" based solely upon Internet searches.
Abstract:
The aim of this work was to exemplify the specific contribution of both two- and three-dimensional (3D) X-ray computed tomography to characterise earthworm burrow systems. To achieve this purpose we used 3D mathematical morphology operators to characterise burrow systems resulting from the activity of an anecic species (Aporrectodea nocturna) and an endogeic species (Allolobophora chlorotica), when the two species were introduced either separately or together into artificial soil cores. Images of these soil cores were obtained using a medical X-ray tomography scanner. Three-dimensional reconstructions of burrow systems were obtained using a specifically developed segmentation algorithm. To study the differences between burrow systems, a set of classical tools of mathematical morphology (granulometries) was used. Granulometries based on different structuring elements clearly separated the different burrow systems. They enabled us to show that burrows made by the anecic species were fatter, longer, more vertical and more continuous, but less sinuous, than burrows of the endogeic species. The granulometry transform of the soil matrix showed that burrows made by A. nocturna were more evenly distributed than those of A. chlorotica. Although a good discrimination was possible when only one species was introduced into the soil cores, it was not possible to separate burrows of the two species from each other in cases where both species were introduced into the same soil core. This limitation, partly due to the insufficient spatial resolution of the medical scanner, precluded the use of the morphological operators to study putative interactions between the two species.
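A granulometry by openings of the kind applied to the 3D burrow images can be illustrated on a 1D binary signal. This toy version, in plain Python, only sketches the principle: successively larger openings remove successively larger features, and the drop in remaining foreground at each size characterises the size distribution of the structures.

```python
# Toy 1D granulometry by morphological openings. The paper applies the
# 3D analogue (with different structuring elements) to burrow images;
# this sketch is for illustration only.

def opening_1d(x, k):
    """Opening of a binary sequence with a flat structuring element of
    length k: only runs of 1s of length >= k survive."""
    out = [0] * len(x)
    i = 0
    while i < len(x):
        if x[i]:
            j = i
            while j < len(x) and x[j]:
                j += 1                      # scan to the end of the run
            if j - i >= k:                  # run survives the opening
                for t in range(i, j):
                    out[t] = 1
            i = j
        else:
            i += 1
    return out

def granulometry(x, sizes):
    """Remaining foreground after opening at each size; the decrease
    between successive sizes is the pattern spectrum."""
    return [sum(opening_1d(x, k)) for k in sizes]

signal = [1, 1, 1, 1, 0, 1, 1, 0, 1]          # runs of length 4, 2 and 1
print(granulometry(signal, [1, 2, 3, 4, 5]))  # [7, 6, 4, 4, 0]
```

In the 3D case the structuring element's shape (e.g., elongated vertically or horizontally) is what lets the transform separate fat, vertical anecic burrows from thin, sinuous endogeic ones.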