972 results for Science methodology
Abstract:
With the increasing emphasis on health and well-being, nutritional aspects need to be incorporated as a dimension of product development. Accordingly, the production of a high-fibre snack food from a mixture of corn and flaxseed flours was optimized by response surface methodology. The independent variables considered in this study were feed moisture, process temperature, and flaxseed flour addition, as they were found to significantly impact the resulting product. These variables were studied according to a rotatable central composite design matrix (-1.68, -1, 0, 1, 1.68). The response variable was the expansion ratio, since it is highly correlated with acceptability. The optimized corn-flaxseed snack presented a sevenfold increase in dietary fibre and an almost 100% increase in protein content compared to the pure corn snack, and yielded an acceptability score of 6.93. This score was similar to those observed for corn snack brands on the market, indicating the commercial potential of this new product, which can help increase the daily consumption of dietary fibre.
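For readers unfamiliar with this class of designs, the coded matrix can be generated mechanically. The following Python sketch builds the coded runs of a rotatable central composite design for three factors; the factor names and the number of centre-point replicates are illustrative assumptions, not values taken from the abstract:

```python
# Minimal sketch: coded runs of a rotatable central composite design, k = 3.
# Levels are -1.68, -1, 0, 1, 1.68 as in the study; factor names are assumed.
from itertools import product

alpha = 1.682                      # rotatability: alpha = (2**k) ** 0.25 for k = 3
factors = ["feed_moisture", "temperature", "flaxseed_flour"]

factorial = list(product([-1, 1], repeat=3))              # 8 cube points
axial = [tuple(a if i == j else 0 for j in range(3))      # 6 axial points
         for i in range(3) for a in (-alpha, alpha)]
center = [(0, 0, 0)] * 6                                  # replicated centre points (count assumed)

design = factorial + axial + center
for run, levels in enumerate(design, 1):
    print(run, dict(zip(factors, levels)))
```

A quadratic response surface for the expansion ratio would then be fitted over these coded runs to locate the optimum.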
Abstract:
This paper presents a new parallel methodology for calculating the determinant of matrices of order n, with computational complexity O(n), using the Gauss-Jordan elimination method and Chio's rule as references. We present our methodology step by step, in clear mathematical language, demonstrating how to calculate the determinant of a matrix of order n in an analytical format. We also present a computational model with one sequential algorithm and one parallel algorithm, both given in pseudo-code.
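As background for the rule the paper builds on, here is a minimal sequential sketch of Chio's condensation in Python. It only illustrates the underlying recurrence, not the authors' parallel algorithm, and the row-swap pivoting is an assumption:

```python
def det_chio(a):
    """Determinant via Chio's condensation: det(A) = det(B) / p**(n-2),
    where p = a[0][0] and b[i][j] = p*a[i+1][j+1] - a[i+1][0]*a[0][j+1]."""
    a = [row[:] for row in a]              # work on a copy
    n = len(a)
    sign, denom = 1, 1.0
    while n > 1:
        if a[0][0] == 0:                   # need a nonzero pivot: swap rows
            for r in range(1, n):
                if a[r][0] != 0:
                    a[0], a[r] = a[r], a[0]
                    sign = -sign
                    break
            else:
                return 0.0                 # first column all zero => det = 0
        p = a[0][0]
        a = [[p * a[i][j] - a[i][0] * a[0][j] for j in range(1, n)]
             for i in range(1, n)]
        denom *= p ** (n - 2)              # each condensation scales det by p**(n-2)
        n -= 1
    return sign * a[0][0] / denom

print(det_chio([[2, 1, 3], [1, 0, 2], [4, 1, 8]]))   # -1.0
```

Each condensation step is a data-parallel map over the (n-1)^2 entries of the reduced matrix, which is the structural property a parallel version can exploit.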
Abstract:
The field of "computer security" is often considered something between art and science. This is partly due to the lack of widely agreed, standardized methodologies for evaluating the degree of security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied today and by proposing an enhanced methodology that can be applied to different threat scenarios with the same degree of effectiveness. Security testing methodologies are the first step towards standardized security evaluation processes and an understanding of how security threats evolve over time. This dissertation analyzes some of the most widely used methodologies, identifying differences and commonalities that are useful for comparing them and assessing their quality. It then proposes a new enhanced methodology built by combining the best elements of each analyzed methodology. The designed methodology is tested on different systems with very effective results, the main evidence that it can be applied in practical cases. Much of the dissertation discusses and demonstrates how the presented testing methodology can be applied to such different systems, and even used to evade security measures by inverting its goals and scope. Real cases are often hard to find in methodology documents; in contrast, this dissertation presents real, practical cases with technical details on how to apply the methodology. Electronic voting systems are the first field test considered, with Pvote and Scantegrity as the two tested systems. The usability and effectiveness of the designed methodology for electronic voting systems are demonstrated through these field cases. Furthermore, reputation and antivirus engines have also been analyzed, with similar results. The dissertation concludes by presenting some general guidelines for building a coordination-based approach to electronic voting systems that improves security without reducing system modularity.
Abstract:
This report shares my efforts in developing a solid unit of instruction with a clear focus on student outcomes. I have been a teacher for 20 years and have been writing and revising curricula for much of that time. However, most of that work was developed without the benefit of current research on how students learn, and did not focus on what and how students are learning. My journey as a teacher has involved a lot of trial and error. My traditional method of teaching is to look at the benchmarks (now content expectations) to see what needs to be covered. My unit consists of having students read the appropriate sections in the textbook, complete worksheets, watch a video, and take some notes. I try to include at least one hands-on activity, one or more quizzes, and the traditional end-of-unit test consisting mostly of multiple-choice questions I find in the textbook. I try to be engaging, make the lessons fun, and hope that at the end of the unit my students get whatever concepts I've presented so that we can move on to the next topic. I want to increase students' understanding of science concepts and their ability to connect that understanding to the real world. However, sometimes I feel that my lessons are missing something. For a long time I have wanted to develop a unit of instruction that I know is an effective tool for the teaching and learning of science. In this report, I describe my efforts to reform my curricula using the "Understanding by Design" process. I want to see if this style of curriculum design will help me be a more effective teacher and whether it will lead to an increase in student learning. My hypothesis is that this new (for me) approach to teaching will lead to increased understanding of science concepts among students because it is based on purposefully thinking about learning targets grounded in the "big ideas" of science. For my reformed curricula I incorporate lessons from several outstanding programs I've been involved with, including EpiCenter (Purdue University), Incorporated Research Institutions for Seismology (IRIS), the Master of Science Program in Applied Science Education at Michigan Technological University, and the Michigan Association for Computer Users in Learning (MACUL). In this report, I present the methodology I used to develop a new unit of instruction based on the Understanding by Design process. I present several lessons and learning plans I've developed for the unit, which follow the 5E Learning Cycle, as appendices at the end of this report. I also include the results of pilot testing one of the lessons. Although the lesson I pilot-tested was not as successful in increasing student learning outcomes as I had anticipated, the development process I followed was helpful in that it required me to focus on important concepts. Conducting the pilot test was also helpful because it led me to identify ways in which I could improve the lesson in the future.
Abstract:
This commentary is based on a general concern about the low level of self-criticism (and self-evaluation) in the interpretation of molecular pharmacological data published in ethnopharmacology-related journals. Reports on potentially new lead structures or pharmacological effects of medicinal plant extracts are mushrooming. At the same time, nonsense in bioassays is an increasing phenomenon in herbal medicine research. The mere fact that a dataset is reproducible does not imply that it is meaningful. Currently, there are thousands of claims of pharmacological effects of medicinal plants and natural products. It is argued that claims to knowledge in ethnopharmacology, as in the exact sciences, should be rationally criticized if they have empirical content, as is the case with biochemical and pharmacological analyses. Here the major problem is the misuse of the concentration-effect paradigm and the overinterpretation of data obtained in vitro. Given the almost exponential increase in the number of scientific papers published, it may be the moment to adopt a falsificationist methodology.
Abstract:
If we postulate a need for the transformation of society towards sustainable development, we also need to transform science and overcome the fact/value split that makes it impossible for science to be accountable to society. The orientation of this paradigm transformation in science has been under debate for four decades and has generated important theoretical concepts, but these have had limited impact so far. This is due to a contradictory normative framing of science policy that science has difficulty dealing with, not least because the dominant framing creates a lock-in. We postulate that, in addition to introducing transdisciplinarity, science needs to strive to integrate the normative aspect of sustainable development at the meta-level. This requires a strategically managed niche within which scholars and practitioners from many different disciplines can engage in a long-term common learning process, in order to become a "thought collective" (Fleck) capable of initiating the paradigm transformation. Arguing with Piaget that "decentration" is essential to achieve normative orientation and coherence in a learning collective, we introduce a learning approach, Cohn's "Theme-Centred Interaction", which provides a methodology for explicitly working with the objectivity and subjectivity of statements and positions in a "real-world" context, and for consciously integrating the concerns of individuals in their interdependence with the world. This should enable a thought collective to address the epistemological and ethical barriers to science for sustainable development.
Abstract:
Clock synchronization on the order of nanoseconds is one of the critical factors for time-based localization. Currently used time synchronization methods were developed for the more relaxed needs of network operation, so their usability for positioning should be carefully evaluated. In this paper, we are particularly interested in GPS-based time synchronization. To judge its usability for localization, we need a method that can evaluate the achieved time synchronization with nanosecond accuracy. Our method for evaluating synchronization accuracy is inspired by signal processing algorithms and relies on fine-grained timing information. The method is able to calculate the clock offset and skew between devices with nanosecond accuracy in real time. It was implemented using software-defined radio technology. We demonstrate that GPS-based synchronization suffers from a residual clock offset in the range of a few hundred nanoseconds, but that the clock skew is negligible. Finally, we determine a corresponding lower bound on the expected positioning error.
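To make the offset/skew terminology concrete, the sketch below fits a line to pairs of timestamps of the same events as seen by two clocks: the intercept is the offset and the slope's deviation from one is the skew. This is a generic illustration under invented numbers, not the authors' software-defined-radio implementation:

```python
# Minimal sketch: estimate clock offset and skew from paired event timestamps
# (e.g. the same radio event time-stamped by two receivers). Data are invented.
import numpy as np

t_a = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # seconds, as seen by clock A
t_b = t_a * (1 + 2e-9) + 150e-9                  # clock B: 2 ppb skew, 150 ns offset

slope, offset = np.polyfit(t_a, t_b, 1)          # least-squares line: t_b = slope*t_a + offset
skew = slope - 1.0                               # fractional frequency error
print(f"offset = {offset * 1e9:.1f} ns, skew = {skew:.2e}")
```

With fine-grained timestamps this kind of fit can track both quantities in real time, which is the role the paper's evaluation method plays.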
Abstract:
Measuring the ratio of heterophils to lymphocytes (H/L ratio) in response to different stressors is a standard tool for assessing long-term stress in laying hens, but detailed information on the reliability of measurements, on measurement techniques and methods, and on absolute cell counts is often lacking. Laying hens offered nest boxes at different positions at different ages were compared in a two-treatment crossover experiment to provide detailed information on the measurement procedure and on the difficulties of interpreting H/L ratios under commercial conditions. H/L ratios were pen-specific and depended on age and aviary system. There was no effect of nest position. Heterophil and lymphocyte counts were not correlated within individuals. Absolute counts of heterophils and lymphocytes, and hence H/L ratios, differed between individuals, whereas total leucocyte counts were similar across individuals. The reliability of the method using relative cell counts was good, yielding a correlation coefficient between double counts of r > 0.9. It was concluded that population-based reference values may not be sensitive enough to detect individual stress reactions, that the H/L ratio may not be useful as an indicator of stress under commercial conditions because of confounding factors, and that other, non-invasive measurements should be adopted.
Abstract:
This study assessed and compared sociodemographic and income characteristics along with food and physical activity assets (i.e. grocery stores, fast food restaurants, and park areas) in the Texas Childhood Obesity Research Demonstration (CORD) Study intervention and comparison catchment areas in Houston and Austin, Texas. The Texas CORD Study used a quasi-experimental design, so it is necessary to establish the internal validity of the study characteristics by confirming that the intervention and comparison catchment areas are statistically comparable. In this ecological study, ArcGIS and Esri Business Analyst were used to spatially relate U.S. Census Bureau and other business listing data to the specific school attendance zones within the catchment areas. T-tests were used to compare percentages of sociodemographic and income characteristics and densities of food and physical activity assets between the intervention and comparison catchment areas. Only five variables showed significant differences between the intervention and comparison catchment areas: the age groups 0-4 and 35-64, the percentages of owner-occupied and renter-occupied households, and the percentage of Asian and Pacific Islander residents. All other variables showed no significant differences between the two groups. This study shows that the methodology used to select intervention and comparison catchment areas for the Texas CORD Study was effective and can be used in future studies. The results can be used in future Texas CORD studies to confirm the comparability of the intervention and comparison catchment areas. In addition, this study demonstrates a methodology for describing detailed characteristics of a geographic area that practitioners, researchers, and educators can use.
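As a hedged illustration of the comparison step, the sketch below runs a two-sample t-test on one hypothetical characteristic measured across the school attendance zones of each catchment area; the numbers are invented, and the study's real inputs come from U.S. Census Bureau and Esri Business Analyst data:

```python
# Minimal sketch: compare one characteristic between catchment areas.
# Values (% of residents aged 0-4 per attendance zone) are invented.
from scipy import stats

pct_age_0_4_intervention = [7.1, 8.3, 6.9, 7.8, 8.0]
pct_age_0_4_comparison   = [6.2, 6.8, 5.9, 6.5, 7.0]

t, p = stats.ttest_ind(pct_age_0_4_intervention, pct_age_0_4_comparison)
print(f"t = {t:.2f}, p = {p:.3f}")   # p < 0.05 would flag a significant difference
```

Repeating such a test over every sociodemographic, income, and asset-density variable yields the kind of tally reported above, where only five variables differed significantly.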
Abstract:
Objective: The present study offers a novel methodological contribution to the study of the configuration and dynamics of research groups, through a comparative perspective on funded projects (inputs) and publication co-authorships (outputs). Method: A combination of bibliometric techniques and social network analysis was applied to a case study: the Departamento de Bibliotecología (DHUBI), Universidad Nacional de La Plata, Argentina, for the period 2000-2009. The results were interpreted statistically, and staff members of the department were interviewed. Results: The method makes it possible to distinguish groups, identify their members, and reflect group make-up through an analytical strategy that involves the categorization of actors and the interdisciplinary and national or international projection of the networks they configure. The integration of these two aspects (input and output) at different points in time over the analyzed period leads to inferences about group profiles and the roles of actors. Conclusions: The methodology presented is conducive to micro-level interpretations in a given area of study, regarding individual researchers or research groups. Because the comparative input-output analysis broadens the base of information and makes it possible to follow individual and group trends over time, it may prove very useful for the management, promotion, and evaluation of science.
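On the output side, the co-authorship component of such an analysis can be reproduced with standard network tooling. The following sketch builds a weighted co-authorship graph from publication records and treats connected components as candidate groups; the records and author names are invented for illustration:

```python
# Minimal sketch: co-authorship network from publication author lists.
# Records are invented; real input would be the department's 2000-2009 output.
from itertools import combinations
import networkx as nx

publications = [
    ["Garcia", "Lopez", "Diaz"],
    ["Garcia", "Lopez"],
    ["Martinez", "Fernandez"],
]

g = nx.Graph()
for authors in publications:
    for a, b in combinations(authors, 2):
        if g.has_edge(a, b):
            g[a][b]["weight"] += 1       # edge weight = number of co-authored papers
        else:
            g.add_edge(a, b, weight=1)

# connected components approximate candidate research groups
for group in nx.connected_components(g):
    print(sorted(group))
```

Matching these output-side components against the input-side project rosters is what allows the method described above to categorize actors and infer group profiles.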
Abstract:
Since the middle of the twentieth century, criticism of quantitative research tools in the social sciences has gradually led to attempts to find a new methodology, called 'qualitative research'. At the same time, qualitative research has prompted a reconsideration of the usefulness of many beneficial tools and methodologies that were discarded during the move towards research based on quantitative tools. The purpose of this paper is to discuss the essential elements of the qualitative research approach, and then to argue for the possibility of introducing the long-established methodology of historical science into qualitative research, in order to improve the accuracy of qualitative data.