895 results for Teaching of mathematics. Combinatorial analysis. Heuristic analysis of combinatorial problems
Abstract:
This paper presents a historical examination of employment in old age in Spain, in order to characterize this labour segment and identify and analyse its specific problems. One of these problems is the life-cycle deskilling process, already shown for certain national cases. This study explores whether this hypothesis also holds in Spain. The perspective used is essentially quantitative, as our analysis is based on the age-profession tables in Spanish population censuses from 1900 to 1970.
Abstract:
Background Medication adherence is a complex, dynamic and changing behaviour that is affected by a variety of factors, including the patient's beliefs and life circumstances. Studies have highlighted barriers to medication adherence (e.g., unmanaged side effects or a lack of social support), as well as facilitators of medication adherence (e.g., technical simplicity of treatment and psychological acceptance of the disease). Since August 2004, in Lausanne (Switzerland), physicians have referred patients who are either experiencing or are at risk of experiencing problems with their HIV antiretroviral treatment (ART) to a routine interdisciplinary ART adherence programme. This programme consists of a multifactorial intervention including electronic drug monitoring (MEMS(TM)). Objective This study's objective was to identify the barriers and facilitators encountered by HIV patients with suboptimal medication adherence (≤90 % adherence over the study period). Setting The community pharmacy of the Department of Ambulatory Care and Community Medicine in Lausanne (Switzerland). Method The study consisted of a retrospective, qualitative, thematic content analysis of pharmacists' notes that were taken during semi-structured interviews with patients and conducted as part of the ART adherence programme between August 2004 and May 2008. Main outcome measure Barriers and facilitators encountered by HIV patients. Results Barriers to and facilitators of adherence were identified for the 17 included patients. These factors fell into three main categories: (1) cognitive, emotional and motivational; (2) environmental, organisational and social; and (3) treatment and disease. Conclusion The pharmacists' notes revealed that diverse barriers and facilitators were discussed during medication adherence interviews. Indeed, the results showed that the 17 non-adherent patients encountered barriers and benefited from facilitators. Therefore, pharmacists should inquire about all factors, regardless of whether they have a negative or a positive impact on medication adherence, and should consider all dimensions of patient adherence. The simultaneous strengthening of facilitators and better management of barriers may allow healthcare providers to tailor care to a patient's specific needs and support each individual patient in improving their medication-related behaviour.
Abstract:
Autopsy-negative sudden cardiac deaths (SCD) seen in forensic practice are most often thought to be the result of sudden arrhythmic death syndrome. Postmortem genetic analysis is recommended in such cases, but is currently performed in only a few academic centers. In order to determine actual current practice, an on-line questionnaire was sent by e-mail to members of various forensic medical associations. The questions addressed routine procedures employed in cases of sudden cardiac death (autopsy ordering, macroscopic and microscopic cardiac examination, conduction tissue examination, immunohistochemistry and electron microscopy, biochemical markers, sampling and storage of material for genetic analyses, toxicological analyses, and molecular autopsy). Some questions concerned the legal and ethical aspects of genetic analyses in postmortem examinations, as well as any existing multidisciplinary collaborations in SCD cases. There were 97 respondents, mostly from European countries. Genetic testing in cases of sudden cardiac death is rarely practiced in routine forensic investigation. Approximately 60% of respondents reported not having the means to perform genetic postmortem testing and 40% do not collect adequate material to perform these investigations at a later date, despite working at university hospitals. The survey demonstrated that many of the problems involved in the adequate investigation of SCD cases are often financial in origin, due to the fact that activities in forensic medicine are often paid for by and dependent on the judicial authorities. Problems also exist concerning contact with family members and/or the family doctor, as well as the often-nonexistent collaboration with other clinicians with special expertise beneficial to the investigation of SCD cases, such as cardiologists and geneticists. This study highlights the importance of establishing guidelines for molecular autopsies in forensic medicine.
Abstract:
About 50% of living species are holometabolan insects. Therefore, unraveling the origin of insect metamorphosis from the hemimetabolan (gradual metamorphosis) to the holometabolan (sudden metamorphosis at the end of the life cycle) mode is equivalent to explaining how all this biodiversity originated. One of the problems with studying the evolution from hemimetaboly to holometaboly is that most information is available only for holometabolan species. Within the hemimetabolan group, our model, the cockroach Blattella germanica, is the most studied species. However, given that the study of adult morphogenesis at the organismic level is still complex, we focused on the study of the tergal gland (TG) as a minimal model of metamorphosis. The TG is formed in tergites 7 and 8 (T7-8) in the last days of the last nymphal instar (nymph 6). The comparative study of four T7-T8 transcriptomes provided us with crucial keys to TG formation, as well as essential information about the mechanisms and circuitry that allow the shift from nymphal to adult morphogenesis.
Abstract:
Contemporary public administrations have become increasingly complex, having to coordinate actions with emerging actors in the public and private spheres. In this scenario, modern ICTs have begun to be seen as an ideal vehicle for resolving some of the problems of public administration. We argue that there is a clear need to explore the extent to which public administrations are undergoing a process of transformation towards a network government linked to the systematic incorporation of ICTs into their basic activities. Through critically analysing a selection of e-government evaluation reports, we conclude that research should be carried out if we are to build a solid government assessment framework based on network-like organisation characteristics.
Abstract:
In this study we analyze the application of reflective learning during the initial training of mathematics teachers. This model is based on sociocultural theories of human learning and assumes that interaction and contrast make possible the co-construction and active reconstruction of knowledge. The study was based on a sample of 29 student teachers. The qualitative analysis allowed us to identify factors that facilitate the incorporation of reflective learning into university teaching, as well as the degree of effectiveness of this model for learning to teach mathematics.
Abstract:
When laboratory intercomparison exercises are conducted, there is no a priori dependence of the concentration of a certain compound determined in one laboratory on that determined by another laboratory. The same applies when comparing different methodologies. An existing data set of total mercury readings in fish muscle samples involved in a Brazilian intercomparison exercise was used to show that correlation analysis is the most effective statistical tool in this kind of experiment. Problems associated with alternative analytical tools, such as mean or paired t-test comparison and regression analysis, are discussed.
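To make the comparison concrete, the following is a minimal sketch contrasting correlation analysis with a paired t-test on two laboratories' readings of the same samples; the numbers are hypothetical and the snippet is illustrative rather than the paper's own analysis.

```python
# Minimal sketch: comparing two laboratories' total-mercury readings for the
# same fish-muscle samples (values are hypothetical, not from the paper).
import numpy as np
from scipy import stats

lab_a = np.array([0.12, 0.34, 0.56, 0.21, 0.48, 0.95, 0.30])  # mg/kg, hypothetical
lab_b = np.array([0.14, 0.31, 0.60, 0.19, 0.50, 0.99, 0.28])  # mg/kg, hypothetical

# Correlation analysis: do the two laboratories rank and scale the samples consistently?
r, p_corr = stats.pearsonr(lab_a, lab_b)

# Paired t-test on the same samples: is there a systematic mean difference between labs?
t, p_ttest = stats.ttest_rel(lab_a, lab_b)

print(f"Pearson r = {r:.3f} (p = {p_corr:.3g})")
print(f"Paired t  = {t:.3f} (p = {p_ttest:.3g})")
```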
Abstract:
This thesis concentrates on developing a practical local approach methodology based on micromechanical models for the analysis of ductile fracture of welded joints. Two major problems involved in the local approach, namely the dilational constitutive relation reflecting the softening behaviour of the material, and the failure criterion associated with the constitutive equation, have been studied in detail. Firstly, considerable effort was devoted to the numerical integration and computer implementation of the non-trivial dilational Gurson-Tvergaard model. Considering the weaknesses of the widely used Euler forward integration algorithms, a family of generalized mid-point algorithms is proposed for the Gurson-Tvergaard model. Correspondingly, based on the decomposition of stresses into hydrostatic and deviatoric parts, an explicit seven-parameter expression for the consistent tangent moduli of the algorithms is presented. This explicit formula avoids any matrix inversion during numerical iteration and thus greatly facilitates the computer implementation of the algorithms and increases the efficiency of the code. The accuracy of the proposed algorithms and other conventional algorithms has been assessed in a systematic manner in order to highlight the best algorithm for this study. The accurate and efficient performance of the present finite element implementation of the proposed algorithms has been demonstrated by various numerical examples. It has been found that the true mid-point algorithm (a = 0.5) is the most accurate one when the deviatoric strain increment is radial to the yield surface, and that it is very important to use the consistent tangent moduli in the Newton iteration procedure. Secondly, an assessment has been made of the consistency of current local failure criteria for ductile fracture, namely the critical void growth criterion, the constant critical void volume fraction criterion and Thomason's plastic limit load failure criterion. Significant differences in the predictions of ductility by the three criteria were found. By assuming that the void grows spherically and using the void volume fraction from the Gurson-Tvergaard model to calculate the current void matrix geometry, Thomason's failure criterion has been modified and a new failure criterion for the Gurson-Tvergaard model is presented. Comparison with Koplik and Needleman's finite element results shows that the new failure criterion is indeed fairly accurate. A novel feature of the new failure criterion is that a mechanism for void coalescence is incorporated into the constitutive model. Hence material failure is a natural result of the development of macroscopic plastic flow and the microscopic internal necking mechanism. With the new failure criterion, the critical void volume fraction is not a material constant; the initial void volume fraction and/or void nucleation parameters essentially control the material failure. This feature is very desirable and makes the numerical calibration of void nucleation parameter(s) possible and physically sound. Thirdly, a local approach methodology based on the above two major contributions has been built up in ABAQUS via the user material subroutine UMAT and applied to welded T-joints. By using the void nucleation parameters calibrated from simple smooth and notched specimens, it was found that the fracture behaviour of the welded T-joints can be well predicted using the present methodology.
This application has shown how the damage parameters of both the base material and the heat-affected zone (HAZ) material can be obtained in a step-by-step manner, and how useful and capable the local approach methodology is in the analysis of fracture behaviour and crack development, as well as in the structural integrity assessment of practical problems where non-homogeneous materials are involved. Finally, a procedure for the possible engineering application of the present methodology is suggested and discussed.
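For context, a commonly cited form of the Gurson-Tvergaard yield function referred to above is given below; the abstract does not state which exact variant or parameter values the thesis adopts, so this should be read as general background rather than as the thesis's own formulation.

```latex
\Phi = \left(\frac{\sigma_{\mathrm{eq}}}{\sigma_y}\right)^{2}
     + 2\,q_1 f \cosh\!\left(\frac{3\,q_2\,\sigma_m}{2\,\sigma_y}\right)
     - \left(1 + q_3 f^{2}\right) = 0
```

Here σ_eq is the macroscopic von Mises equivalent stress, σ_m the mean (hydrostatic) stress, σ_y the yield stress of the matrix material, f the void volume fraction, and q_1, q_2, q_3 the Tvergaard parameters; the hydrostatic term inside the cosh is what makes the model dilational and motivates the hydrostatic/deviatoric split used in the consistent tangent moduli.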
Abstract:
By coupling the Boundary Element Method (BEM) and the Finite Element Method (FEM), an algorithm that combines the advantages of both numerical processes is developed. The main aim of the work concerns the time domain analysis of general three-dimensional wave propagation problems in elastic media. In addition, mathematical and numerical aspects of the related BE-, FE- and BE/FE-formulations are discussed. The coupling algorithm allows investigations of elastodynamic problems with a BE- and a FE-subdomain. In order to observe the performance of the coupling algorithm, two problems are solved and their results compared to other numerical solutions.
Abstract:
The section “Lordship and Bondage” in Hegel’s Phenomenology of Spirit offers us, through its criticism of slavery, some indications regarding Hegel’s conception of human nature. In this paper some consequences of this conception for Hegel’s political philosophy are identified and presented. The analysis shows that problems may emerge when we analyze some fundamental Hegelian concepts – “recognition” and “men” – if we take into consideration the way these concepts were defined in the master-slave dialectic. In light of these problems it is pointed out that Hegel’s political philosophy, and also his position regarding slavery, become less cogent and more susceptible to criticism. The last part of the text analyzes some consequences of the problems related to the possibility of defining the concepts “recognition” and “men” in terms of Hegel’s model of the state.
Abstract:
In Canada freedom of information must be viewed in the context of governing -- how do you deal with an abundance of information while balancing a diversity of competing interests? How can you ensure people are informed enough to participate in crucial decision-making, yet willing enough to let some administrative matters be dealt with in camera without their involvement in every detail? In an age when taxpayers' coalition groups are on the rise, and the government is encouraging the establishment of Parent Council groups for schools, the issues and challenges presented by access to information and protection of privacy legislation are real ones. The province of Ontario's decision to extend freedom of information legislation to local governments does not ensure, or equate to, full public disclosure of all facts or necessarily guarantee complete public comprehension of an issue. The mere fact that local governments, like school boards, decide to collect, assemble or record some information and not to collect other information implies that a prior decision was made by "someone" on what was important to record or keep. That in itself means that not all the facts are going to be disclosed, regardless of the presence of legislation. The resulting lack of information can lead to public mistrust and lack of confidence in those who govern. This is completely contrary to the spirit of the legislation, which was to provide interested members of the community with facts so that values like political accountability and trust could be ensured and meaningful criticism and input obtained on matters affecting the whole community. This thesis first reviews the historical reasons for adopting freedom of information legislation, reasons which are rooted in our parliamentary system of government. However, the same reasoning for enacting such legislation cannot be applied carte blanche to the municipal level of government in Ontario, or more specifically to the programs, policies or operations of a school board. The purpose of this thesis is to examine whether the Municipal Freedom of Information and Protection of Privacy Act, 1989 (MFIPPA) was a necessary step to ensure greater openness from school boards. Based on a review of the Orders made by the Office of the Information and Privacy Commissioner/Ontario, it also assesses how successfully freedom of information legislation has been implemented at the municipal level of government. The Orders provide an opportunity to review what problems school boards have encountered, and what guidance the Commissioner has offered. Reference is made to a value framework as an administrative tool in critically analyzing the suitability of MFIPPA to school boards. The conclusion is drawn that MFIPPA appears to have inhibited rather than facilitated openness in local government. This may be attributed to several factors, inclusive of the general uncertainty, confusion and discretion in interpreting various provisions and exemptions in the Act. Some of the uncertainty is due to the fact that an insufficient number of school board staff are familiar with the Act. The complexity of the Act and its legalistic procedures have over-formalized the processes of exchanging information. In addition, there appears to be a concern among municipal officials that granting any access to information may be violating the personal privacy rights of others. These concerns translate into indecision and extreme caution in responding to inquiries.
The result is delay in responding to information requests and a lack of uniformity in the responses given. However, the mandatory review of the legislation does afford an opportunity to address some of these problems and to make this complex Act more suitable for application to school boards. In order for the Act to function more efficiently and effectively, legislative changes must be made to MFIPPA. It is important that the recommendations for improving the Act be adopted before the government extends this legislation to any other public entities.
Abstract:
Computational Biology is the research area that contributes to the analysis of biological data through the development of algorithms which address significant research problems. The data from molecular biology include DNA, RNA, protein and gene expression data. Gene expression data provide the expression level of genes under different conditions. Gene expression is the process of transcribing the DNA sequence of a gene into mRNA sequences, which in turn are later translated into proteins. The number of copies of mRNA produced is called the expression level of a gene. Gene expression data is organized in the form of a matrix. Rows in the matrix represent genes and columns in the matrix represent experimental conditions. Experimental conditions can be different tissue types or time points. Entries in the gene expression matrix are real values. Through the analysis of gene expression data it is possible to determine the behavioral patterns of genes, such as the similarity of their behavior, the nature of their interaction, their respective contribution to the same pathways and so on. Similar expression patterns are exhibited by genes participating in the same biological process. These patterns have immense relevance and application in bioinformatics and clinical research. These patterns are used in the medical domain to aid in more accurate diagnosis, prognosis, treatment planning, drug discovery and protein network analysis. To identify various patterns from gene expression data, data mining techniques are essential. Clustering is an important data mining technique for the analysis of gene expression data. To overcome the problems associated with clustering, biclustering is introduced. Biclustering refers to the simultaneous clustering of both rows and columns of a data matrix. Clustering is a global model whereas biclustering is a local one. Discovering local expression patterns is essential for identifying many genetic pathways that are not apparent otherwise. It is therefore necessary to move beyond the clustering paradigm towards developing approaches which are capable of discovering local patterns in gene expression data. A bicluster is a submatrix of the gene expression data matrix. The rows and columns in the submatrix need not be contiguous as in the gene expression data matrix. Biclusters are not disjoint. Computation of biclusters is costly because one has to consider all the combinations of columns and rows in order to find all the biclusters. The search space for the biclustering problem is 2^(m+n), where m and n are the number of genes and conditions respectively. Usually m+n is more than 3000. The biclustering problem is NP-hard. Biclustering is a powerful analytical tool for the biologist. The research reported in this thesis addresses the problem of biclustering. Ten algorithms are developed for the identification of coherent biclusters from gene expression data. All these algorithms make use of a measure called mean squared residue to search for biclusters. The objective here is to identify biclusters of maximum size with a mean squared residue lower than a given threshold.
All these algorithms begin the search from tightly coregulated submatrices called seeds. These seeds are generated by the K-Means clustering algorithm. The algorithms developed can be classified as constraint based, greedy and metaheuristic. Constraint based algorithms use one or more of the various constraints, namely the MSR threshold and the MSR difference threshold. The greedy approach makes a locally optimal choice at each stage with the objective of finding the global optimum. In the metaheuristic approaches, Particle Swarm Optimization (PSO) and variants of the Greedy Randomized Adaptive Search Procedure (GRASP) are used for the identification of biclusters. These algorithms are implemented on the Yeast and Lymphoma datasets. Biologically relevant and statistically significant biclusters are identified by all these algorithms and are validated against the Gene Ontology database. All these algorithms are compared with some other biclustering algorithms. The algorithms developed in this work overcome some of the problems associated with already existing algorithms. With the help of some of the algorithms developed in this work, biclusters with very high row variance, higher than the row variance obtained by any other algorithm using mean squared residue, are identified from both the Yeast and Lymphoma data sets. Such biclusters, which reflect significant changes in expression level, are highly relevant biologically.
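For readers unfamiliar with the mean squared residue (MSR) measure that all ten algorithms optimise, the following is a minimal sketch based on the standard Cheng-and-Church definition; it is illustrative only, not code from the thesis, and the expression matrix, row/column indices and acceptance threshold are hypothetical.

```python
# Minimal sketch of the mean squared residue (MSR) score of a candidate
# bicluster (rows I, columns J) of a gene expression matrix.
# Follows the standard Cheng-and-Church definition; not code from the thesis.
import numpy as np

def mean_squared_residue(data: np.ndarray, rows: list[int], cols: list[int]) -> float:
    sub = data[np.ix_(rows, cols)]                    # the bicluster submatrix
    row_means = sub.mean(axis=1, keepdims=True)       # a_iJ
    col_means = sub.mean(axis=0, keepdims=True)       # a_Ij
    overall_mean = sub.mean()                         # a_IJ
    residue = sub - row_means - col_means + overall_mean
    return float((residue ** 2).mean())

# Hypothetical usage: genes as rows, experimental conditions as columns.
expression = np.random.default_rng(0).normal(size=(100, 20))
msr = mean_squared_residue(expression, rows=[0, 3, 7, 12], cols=[1, 4, 5])
print(f"MSR = {msr:.4f}")  # accept the bicluster only if MSR is below a chosen threshold
```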
Abstract:
Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onwards, primarily after World War II. Econometrics is the unification of statistical analysis, economic theory and mathematics. The history of econometrics can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period can be seen in the works of Tinbergen and Frisch, and also that of Haavelmo in the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, like the use of the laws of error through the development of least squares by Legendre, Laplace, and Gauss, the discipline of econometrics later witnessed the applied works of Edgeworth and Mitchell. A very significant milestone in its evolution has been the work of Tinbergen, Frisch, and Haavelmo in their development of multiple regression and correlation analysis. They used these techniques to test different economic theories using time series data. In spite of the fact that some predictions based on econometric methodology might have gone wrong, the sound scientific nature of the discipline cannot be ignored by anyone. This is reflected in the economic rationale underlying any econometric model, the statistical and mathematical reasoning for the various inferences drawn, etc. The relevance of econometrics as an academic discipline assumes high significance in the above context. Because of the interdisciplinary nature of econometrics (which is a unification of Economics, Statistics and Mathematics), the subject can be taught in all these broad areas, notwithstanding the fact that most often Economics students alone are offered this subject, as those of other disciplines might not have an adequate Economics background to understand it. In fact, even for technical courses (like Engineering), business management courses (like the MBA), professional accountancy courses, etc., econometrics is quite relevant. More relevant is the case of research students of various social sciences, commerce and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to the academic discipline of econometrics in higher education, across the various social science streams, commerce, management, professional accountancy, etc. Accordingly, the analytical ability of the students can be sharpened and their ability to look into socio-economic problems with a mathematical approach can be improved, enabling them to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in the use of computer-based econometric packages, especially at the postgraduate and research levels, needs to be pointed out here. Mere learning of the econometric methodology or the underlying theories alone would not have much practical utility for the students in their future careers, whether in academics, industry, or in practice. This paper seeks to trace the historical development of econometrics and study the current status of econometrics as an academic discipline in higher education. Besides, the paper looks into the problems faced by teachers in teaching econometrics, and those of students in learning the subject, including effective application of the methodology in real-life situations. Accordingly, the paper offers some meaningful suggestions for the effective teaching of econometrics in higher education.
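The paper's emphasis on hands-on training with computer-based packages can be illustrated with a minimal multiple-regression exercise of the kind it advocates; the variables, coefficients and data below are hypothetical and are not drawn from the paper.

```python
# Minimal sketch: multiple regression on hypothetical time series data,
# the kind of hands-on exercise the paper recommends for econometrics training.
import numpy as np

rng = np.random.default_rng(42)
n = 40                                    # hypothetical quarterly observations
income = rng.normal(100, 10, n)           # explanatory variable 1 (illustrative)
price = rng.normal(50, 5, n)              # explanatory variable 2 (illustrative)
demand = 5 + 0.8 * income - 0.4 * price + rng.normal(0, 2, n)  # simulated dependent variable

X = np.column_stack([np.ones(n), income, price])    # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, demand, rcond=None)   # ordinary least squares estimates
print("estimated coefficients (intercept, income, price):", np.round(beta, 3))
```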
Abstract:
Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop through standard methodology, such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
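As a concrete illustration of the staying-in-the-simplex toolkit the paper describes (a log-ratio transformation followed by a compositional singular value decomposition), here is a minimal sketch; the centred log-ratio transform and SVD shown are standard Aitchison-style operations, and the small data set is hypothetical rather than taken from the Scottish limestone data.

```python
# Minimal sketch: centred log-ratio (clr) transform of compositional data
# followed by a singular value decomposition. Illustrative only; not the paper's code.
import numpy as np

def clr(compositions: np.ndarray) -> np.ndarray:
    """Centred log-ratio transform: log of each part relative to the row's geometric mean."""
    logx = np.log(compositions)
    return logx - logx.mean(axis=1, keepdims=True)

# Hypothetical 4-specimen, 3-part compositional data set (each row sums to 1).
x = np.array([
    [0.60, 0.25, 0.15],
    [0.55, 0.30, 0.15],
    [0.40, 0.35, 0.25],
    [0.50, 0.20, 0.30],
])

z = clr(x)
z_centred = z - z.mean(axis=0)              # centre the clr scores before the SVD
u, s, vt = np.linalg.svd(z_centred, full_matrices=False)
print("singular values:", np.round(s, 4))   # relative sizes indicate the dominant compositional variation
```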
Abstract:
The Swedish government has authorised the teaching of mathematics in English to Swedish-speaking students. Much of that teaching is performed by foreign-trained native English-speaking teachers who lack training in teaching second language learners. This systematic review summarises international studies from the last ten years that deal with the teaching of mathematics to second language learners. The review shows that second language students working in a bilingual environment achieve higher levels of content and language knowledge than learners in a monolingual environment. This study also summarises some of the teacher practices that are effective for teaching mathematics in English to second language learners.