822 results for Methodological flexibility
Abstract:
This research compares the methodological tools employed in NOS research, with analysis of what the comparison implies about the structure of nature of science knowledge. Descriptions of practicing teachers' nature of science conceptions were compared based on data collected from forced-choice responses, responses to a qualitative survey, and course writing samples. Participants' understandings were scored differently on the Views of Nature of Science Questionnaire (VNOS) than on the forced-choice measure, Scientific Thinking and Internet Learning Technologies (STILT). In addition, analysis of the writing samples and observations, combined with interviews, portrayed more sophisticated, but more variable, understandings of the nature of science than was evidenced by either the survey or the forced-choice measure. The differences between data collection measures included the degree to which they drew upon context-bound or context-general reasoning, the degree to which they required students to move beyond the simple intelligibility of their responses and allowed students to explore the fruitfulness of the constructs, and the degree to which they revealed the interconnection of participants' NOS conceptions. In light of the different portrayals of a participant's NOS conceptions yielded by these different measures, we call for the use of crystallization as a methodological referent in research.
Abstract:
Recent studies have reported positive associations between maternal exposures to air pollutants and several adverse birth outcomes. However, there have been no assessments of the association between environmental hazardous air pollutants (HAPs) such as benzene, toluene, ethylbenzene, and xylene (BTEX) and neural tube defects (NTDs), a common and serious group of congenital malformations. Before examining this association, two important methodological questions must be addressed: (1) is maternal residential movement likely to result in exposure misclassification, and (2) is it appropriate to lump defects of the neural tube, such as anencephaly and spina bifida, into a composite disease endpoint (i.e., NTDs)? Data from the National Birth Defects Prevention Study and Texas Birth Defects Registry were used to: (1) assess the extent to which change of residence may result in exposure misclassification when exposure is based on the address at delivery; (2) formally assess heterogeneity of the associations of known risk factors for NTDs, using polytomous logistic regression; and (3) conduct a case-control study assessing the association between ambient air levels of BTEX and the risk of NTDs among offspring. Regarding maternal residential mobility, this study suggests that using the address at delivery was not significantly different from using the address at conception when assigning quartile of benzene exposure (OR 1.0, 95% CI 0.9, 1.3). On the question of effect heterogeneity among NTDs, the effect estimates for infant sex (P = 0.017), maternal body mass index (P = 0.016), and folate supplementation (P = 0.050) were significantly different for anencephaly and spina bifida, suggesting it is often more appropriate to assess potential risk factors among subgroups of NTDs.
For the main study question on the association between environmental HAPs and NTDs, mothers who have offspring with isolated spina bifida were 2.4 times as likely to live in areas with the highest benzene levels (95% CI 1.1, 5.0). However, no other significant associations were observed. This project is the first to include not only an assessment of the relationship between environmental levels of BTEX and NTDs, but also two separate studies addressing important methodological issues associated with this question. Our results contribute to the growing body of evidence regarding air pollutant exposure and adverse birth outcomes.
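The heterogeneity assessment described above uses polytomous logistic regression. A minimal sketch of the underlying idea, comparing subtype-specific odds ratios (anencephaly vs. spina bifida) with a Wald heterogeneity test, is shown below; the 2x2-table counts and the two-table shortcut are illustrative assumptions, not the study's actual model or data:

```python
import math

def odds_ratio(a, b, c, d):
    """OR and variance of log(OR) from a 2x2 table:
    a exposed cases, b unexposed cases, c exposed controls, d unexposed controls."""
    or_ = (a * d) / (b * c)
    var_log = 1 / a + 1 / b + 1 / c + 1 / d
    return or_, var_log

# Hypothetical counts for one risk factor (illustrative only)
or1, v1 = odds_ratio(30, 70, 40, 160)   # factor vs. anencephaly cases
or2, v2 = odds_ratio(15, 85, 40, 160)   # same factor vs. spina bifida cases

# Wald test: is the log odds ratio the same for both NTD subtypes?
z = (math.log(or1) - math.log(or2)) / math.sqrt(v1 + v2)
p = math.erfc(abs(z) / math.sqrt(2))    # two-sided normal p-value
print(f"OR(anencephaly)={or1:.2f}, OR(spina bifida)={or2:.2f}, p={p:.3f}")
```

A small p-value here would argue, as the study does, against collapsing the subtypes into a single NTD endpoint.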
Abstract:
In recent years, disaster preparedness through assessment of medical and special needs persons (MSNP) has taken center stage in the public eye, owing to frequent natural disasters such as hurricanes, storm surge, or tsunami caused by climate change and increased human activity on our planet. Statistical methods for complex survey design and analysis have gained significance as a consequence. However, many challenges remain in inferring such assessments over the target population for policy-level advocacy and implementation. Objective. This study discusses the use of statistical methods for disaster preparedness and medical needs assessment to support local and state governments in policy-level decision making and logistic support, so as to avoid loss of life and property in future calamities. Methods. In order to obtain precise and unbiased estimates of medical special needs persons (MSNP) and of disaster preparedness for evacuation in the Rio Grande Valley (RGV) of Texas, a stratified and cluster-randomized multi-stage sampling design was implemented. The US School of Public Health, Brownsville surveyed 3088 households in three counties, namely Cameron, Hidalgo, and Willacy. Multiple statistical methods were implemented and estimates were obtained taking into account the probability of selection and clustering effects. The statistical methods discussed for data analysis were Multivariate Linear Regression (MLR), Survey Linear Regression (Svy-Reg), Generalized Estimating Equations (GEE), and Multilevel Mixed Models (MLM), all with and without sampling weights. Results. The estimated population for the RGV was 1,146,796: 51.5% female, 90% Hispanic, 73% married, 56% unemployed, and 37% with personal transport. 40% of people attained education up to elementary school, another 42% reached high school, and only 18% went to college. Median household income was less than $15,000/year. MSNP were estimated at 44,196 (3.98%) [95% CI: 39,029; 51,123].
All statistical models are in concordance, with MSNP estimates ranging from 44,000 to 48,000. The MSNP estimates by statistical method are: MLR (47,707; 95% CI: 42,462; 52,999), MLR with weights (45,882; 95% CI: 39,792; 51,972), Bootstrap Regression (47,730; 95% CI: 41,629; 53,785), GEE (47,649; 95% CI: 41,629; 53,670), GEE with weights (45,076; 95% CI: 39,029; 51,123), Svy-Reg (44,196; 95% CI: 40,004; 48,390), and MLM (46,513; 95% CI: 39,869; 53,157). Conclusion. The RGV is a flood zone, highly susceptible to hurricanes and other natural disasters. People in the region are mostly Hispanic and under-educated, with among the lowest income levels in the U.S. In the event of a disaster, the population at large would be incapacitated, with only 37% having personal transport available for the care of MSNP. Local and state government intervention in terms of planning, preparation, and support for evacuation is necessary in any such disaster to avoid loss of precious human life. Key words: complex surveys, statistical methods, multilevel models, cluster randomized, sampling weights, raking, survey regression, generalized estimating equations (GEE), random effects, intracluster correlation coefficient (ICC).
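The contrast between the weighted and unweighted estimates above comes down to design-based estimation. A minimal sketch of a Horvitz-Thompson-style weighted total, with made-up weights and responses rather than the actual RGV survey data, shows why the sampling weights matter:

```python
# Design-based (Horvitz-Thompson) estimation of a population total with
# sampling weights; weight = 1 / inclusion probability.  All numbers are
# illustrative assumptions, not the RGV survey data.
samples = [
    # (sampling_weight, msnp_in_household)
    (100.0, 1), (100.0, 1), (100.0, 1), (100.0, 0),
    (800.0, 0), (800.0, 0), (800.0, 1), (800.0, 0),
]

# Unweighted prevalence treats every sampled household alike ...
unweighted = sum(y for _, y in samples) / len(samples)

# ... while the weighted estimate expands each household by its weight
total_msnp = sum(w * y for w, y in samples)   # estimated MSNP households
total_hh = sum(w for w, _ in samples)         # estimated total households
weighted = total_msnp / total_hh
print(total_msnp, total_hh, round(unweighted, 3), round(weighted, 3))
```

Here the heavily oversampled stratum (small weights) has a higher MSNP rate, so the unweighted prevalence overstates the population figure, which is the same mechanism behind the gap between the MLR and Svy-Reg estimates reported above.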
Abstract:
The Blood Pressure Study in Mexican Children (BPSMC) is a short-term longitudinal study of serial blood pressure collected in three observation periods by standardized examinations of 233 female children, 10 to 12 years of age, enrolled in public and private primary schools in Tlalpan, Mexico. The study objectives were: (1) to describe, from baseline information, the distribution and relationship of blood pressure to age and selected anthropometric factors, and to compare the BPSMC results with other blood pressure studies; (2) to examine the sources and amount of variation present in the serial blood pressure of 123 children; and (3) to evaluate observer performance by means of intra- and inter-observer variability. Stepwise regression on the baseline data revealed that, of all anthropometric factors and age, weight was the best predictor of blood pressure. The results of the serial blood pressure measurements show that, besides the known sources of blood pressure variability (subject, day, reading), the physiologic event of menarche has an important bearing upon the variability and characterization of blood pressure in young girls. The effects of blood pressure variability and reliability on the design and analysis of epidemiologic studies became apparent among post-menarcheal girls, whose blood pressure measurements had low reliability. Research is needed to propose alternatives for assessing blood pressure during puberty. Finally, observer performance on blood pressure and anthropometry was evaluated. Anthropometric measurements had reliabilities in excess of R = 0.96. Acceptable reliabilities (R = 0.88 to 0.95) were obtained for systolic and diastolic (phase 4 and 5) blood pressures. The BPSMC showed a 50 percent decrease in measurement error from the first to the third observation period.
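Reliability coefficients like the R values quoted above are commonly computed as an intraclass correlation from repeated readings. A minimal one-way ICC sketch is below; the replicate readings are made-up numbers, not BPSMC data, and the abstract does not specify which reliability estimator was used:

```python
# One-way intraclass correlation (ICC) from k replicate readings per child;
# the systolic values below are illustrative, not BPSMC data.
readings = [  # k = 3 replicate systolic readings per child (mmHg)
    [102, 104, 101],
    [ 98,  97, 100],
    [110, 113, 111],
    [ 95,  96,  94],
]
n, k = len(readings), len(readings[0])
grand = sum(sum(r) for r in readings) / (n * k)
means = [sum(r) / k for r in readings]

# Between-subject and within-subject mean squares from one-way ANOVA
msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
msw = sum((x - m) ** 2 for r, m in zip(readings, means) for x in r) / (n * (k - 1))

icc = (msb - msw) / (msb + (k - 1) * msw)
print(round(icc, 3))   # -> 0.961
```

Values near 1 mean replicate readings agree closely relative to between-child differences, which is what R = 0.88 to 0.95 for blood pressure indicates.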
Abstract:
Globalization in higher education and the new market framework within the university system have introduced the concept of flexibility. One of the main instruments in the development of this process has been the Academic Credit System (the Credit Framework), which is a way of restructuring the curriculum toward flexibility of supply. Many countries around the world have introduced this model, following the American credit system. This paper conceptually explores the links between the main trends in higher education and this new concept.
Abstract:
Since the middle of the twentieth century, criticism of quantitative research tools in the social sciences has gradually led to attempts to find a new methodology, called 'qualitative research'. At the same time, qualitative research has called for a reconsideration of the usefulness of many of the beneficial tools and methodologies that were discarded during the move to research based on quantitative research tools. The purpose of this paper is to discuss the essential elements of the qualitative research approach, and then to argue for the possibility of introducing the long-established methodology of historical science into qualitative research, in order to raise the accuracy of qualitative data.
A methodological model to assist the optimization and risk management of mining investment decisions
Abstract:
Identifying, quantifying, and minimizing technical risks associated with investment decisions is a key challenge for mineral industry decision makers and investors. However, risk analysis in most bankable mine feasibility studies is based on the stochastic modelling of the project's Net Present Value (NPV), which, in most cases, fails to provide decision makers with a truly comprehensive analysis of the risks associated with technical and management uncertainty and, as a result, is of little use for risk management and project optimization. This paper presents a value-chain risk management approach where project risk is evaluated for each step of the project lifecycle, from exploration to mine closure, and risk management is performed as part of a stepwise value-added optimization process.
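The stochastic NPV modelling that the paper critiques can be sketched as a simple Monte Carlo simulation; all figures (price, cost, production, discount rate) are illustrative assumptions, not taken from the paper, and a real feasibility study would model far more risk drivers:

```python
import random

def npv(rate, cashflows):
    """Net present value of cashflows[0..T] discounted at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_npv(n=20_000, seed=42):
    """Monte Carlo NPV for a toy mining project with uncertain metal
    price and operating cost (illustrative parameters only)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        price = rng.gauss(2500, 400)              # $/t of metal sold
        opex = rng.gauss(1600, 200)               # $/t operating cost
        yearly_margin = (price - opex) * 50_000   # 50 kt/yr production
        out.append(npv(0.08, [-120e6] + [yearly_margin] * 8))
    return out

npvs = simulate_npv()
mean_npv = sum(npvs) / len(npvs)
prob_loss = sum(v < 0 for v in npvs) / len(npvs)
print(f"mean NPV = {mean_npv / 1e6:.0f} M$, P(NPV < 0) = {prob_loss:.2f}")
```

Note how this collapses all uncertainty into a single project-level NPV distribution, which is exactly the limitation the value-chain approach addresses by evaluating risk at each lifecycle step instead.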
Abstract:
Next generation telecommunications infrastructures are considered a principal example of a new technology for sustainable economic growth. Their deployment is expected to yield a wealth of innovations – hopefully converted into economic growth – as well as new sources of employment and improved quality of life. In line with these prospects, public administrations at supranational, national, regional and local levels have encouraged the development of these new infrastructures. Moreover, in times of economic crisis, public assistance to deploy such networks carries the promise of placing a weak economy on the road to prosperity. However, such arguments and political claims clearly require rigorous assessment. In particular, any such assessment must adequately address the appropriate form of modelling that best captures the key elements of identifiable progress from next generation access networks (NGAN).
Abstract:
The airline industry is often unstable and unpredictable, forcing airlines to restructure and to create flexible strategies that can respond to changes in the external operating environment. In turbulent and competitive environments, firms with higher flexibility perform better, and the value of these flexibilities depends on the factors of uncertainty in the competitive environment. A model is developed that shows how an airline business model can function in an uncertain environment with the least reduction in business performance over time. An analysis of the business model flexibility of 17 airlines from Asia, Europe, and Oceania, conducted with core competence as the indicator, reveals inconsistencies in the core competence strategies of certain airlines and a corresponding reduction in business performance. The performance variations are explained by a service-oriented core competence strategy that ultimately enables airlines to have a flexible business model, one that not only increases business performance but also helps reduce the uncertainties in the internal and external operating environments.
Abstract:
This work presents a method for the analysis of timber composite beams which considers the slip in the connection system, based on assembling the flexibility matrix of the whole structure. The method builds on one proposed by Tommola and Jutila (2001) and extends it to the case of a gap between the two pieces, with an arbitrary location of the first connector, which notably broadens its practical application. The addition of the gap makes it possible to model a cracked zone in concrete topping, as well as the case in which forming produces the gap. The consideration of stresses induced by changes in temperature and moisture content is also described, and the concept of equivalent eccentricity is generalized. The method has important advantages with respect to the current European standard EN 1995-1-1:2004, as it is able to deal with any type of load, variable sections, discrete and non-regular connection systems, a gap between the two pieces, and variations in temperature and moisture content. Although it could be applied to any structural system, it is especially suited to simply supported and continuous beams. Worked examples are presented at the end, showing that the arrangement of the connection notably modifies the shear force distribution. A first interpretation of the results is made on the basis of strut-and-tie theory. The examples show that the use of EC-5 is unsafe when, as a rule of thumb, the strut or compression field between the support and the first connector is at an angle of less than 60° to the axis of the beam.
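The paper's flexibility-matrix method is more general than the simplified gamma-method of EN 1995-1-1 (Annex B) against which it is compared. As a point of reference, that gamma-method for a two-piece mechanically jointed beam can be sketched as follows; all dimensions, moduli, and the slip modulus are illustrative assumptions:

```python
import math

# Gamma-method (EN 1995-1-1, Annex B) effective bending stiffness of a
# two-piece mechanically jointed beam.  All values are illustrative.
E1, E2 = 12e9, 11e9            # Pa, moduli of elasticity of the two pieces
b, h1, h2 = 0.1, 0.05, 0.15    # m, common width and piece depths
A1, A2 = b * h1, b * h2
I1, I2 = b * h1**3 / 12, b * h2**3 / 12
K, s, L = 8e6, 0.15, 4.0       # slip modulus (N/m), connector spacing (m), span (m)

# gamma = 1 for the reference piece, reduced for the connected piece
gamma1 = 1.0 / (1.0 + math.pi**2 * E1 * A1 * s / (K * L**2))
gamma2 = 1.0

# Centroid distances, split in proportion to the gamma-weighted axial stiffnesses
H = (h1 + h2) / 2
a2 = gamma1 * E1 * A1 * H / (gamma1 * E1 * A1 + gamma2 * E2 * A2)
a1 = H - a2

EI_eff = E1 * I1 + E2 * I2 + gamma1 * E1 * A1 * a1**2 + gamma2 * E2 * A2 * a2**2
print(f"gamma1 = {gamma1:.2f}, EI_eff = {EI_eff / 1e6:.2f} MN.m2")
```

The gamma factor interpolates between no composite action (gamma = 0) and rigid connection (gamma = 1); the flexibility-matrix method drops the assumptions of uniform section and regular connector spacing that this closed-form expression requires.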
Abstract:
Force sensors are used when interaction tasks are carried out by robots in general, and by climbing robots in particular. If the mechanical and electronic systems are contained inside the robot itself, the robot becomes portable and needs no external control. Commercial force sensors cannot be used, due to limited space and weight. By selecting link materials of appropriate stiffness and placing strain gauges on the structure, the robot's own flexibility can be used as a force sensor. Thus, forces applied at the robot tip can be measured without additional external devices; only the gauges and small internal electronic converters are necessary. This paper presents the algorithm proposed to achieve these measurements. Additionally, experimental results are presented.
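One common way to recover tip forces from strain-gauge readings, sketched below, is a linear calibration model inverted by least squares. This is a generic sketch, not the paper's specific algorithm, and the calibration matrix and force values are made-up:

```python
import numpy as np

# Linear calibration model: eps = C @ F, where eps are gauge strains and
# F = (Fx, Fy) is the tip force.  C would be identified beforehand by
# loading the robot with known weights; the values here are illustrative.
C = np.array([[2.0e-6, 0.5e-6],
              [0.3e-6, 1.8e-6],
              [1.1e-6, 1.2e-6]])   # strain per newton, 3 gauges x 2 force axes

F_true = np.array([12.0, -5.0])    # N, unknown in practice
eps = C @ F_true                   # simulated gauge readings (noise-free)

# Least-squares inversion recovers the applied tip force; with more
# gauges than force components the extra readings average out noise.
F_est, *_ = np.linalg.lstsq(C, eps, rcond=None)
print(F_est)
```

In a real deployment `eps` would come from the gauge amplifiers, and the overdetermined system (three gauges, two force components) gives some robustness against individual gauge noise.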
Abstract:
We informally discuss several issues related to the parallel execution of logic programming systems and concurrent logic programming systems, and their generalization to constraint programming. We propose a new view of these systems, based on a particular definition of parallelism. We argue that, under this view, a large number of the actual systems and models can be explained through the application, at different levels of granularity, of only a few basic principles: determinism, non-failure, independence (also referred to as stability), granularity, etc. Also, and based on the convergence of concepts that this view brings, we sketch a model for the implementation of several parallel constraint logic programming source languages and models based on a common, generic abstract machine and an intermediate kernel language.