930 results for method of inquiry
Abstract:
Thermal cutting methods are commonly used in the manufacture of metal parts. Thermal cutting processes separate materials by using heat, with or without a stream of cutting oxygen. Common processes are oxygen, plasma and laser cutting; which method is used depends on the application and material. Numerically controlled thermal cutting is a cost-effective way of prefabricating components. One design aim is to minimize the number of work steps in order to increase competitiveness. As a result, the holes and openings in plate parts manufactured today are made using thermal cutting methods. This is a problem from the fatigue life perspective, because local details in the as-welded state cause a stress rise in a local area of the plate. In cases where the static capacity of the net section is fully utilized, the calculated linear local stresses and stress ranges are often more than twice the material yield strength, so the shakedown criteria are exceeded. Fatigue life assessment of flame-cut details is commonly based on the nominal stress method. For welded details, design standards and instructions provide more accurate and flexible methods, e.g. the hot-spot method, but these methods are not universally applied to flame-cut edges. Laboratory fatigue tests of flame-cut edges indicated that fatigue life estimates based on the standard nominal stress method can be quite conservative in cases where a high notch factor is present. This is an undesirable phenomenon, and it limits the potential for minimizing structure size and total costs. A new calculation method is introduced to improve the accuracy of theoretical fatigue life prediction for a flame-cut edge with a high stress concentration factor. Simple equations were derived from laboratory fatigue test results, which are published in this work. The proposed method is called the modified FAT method (FATmod). 
The method takes into account the residual stress state, surface quality, material strength class and true stress ratio in the critical place.
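As a hedged illustration of the baseline that a modified FAT approach would refine, the standard nominal-stress S-N relation (N = 2e6 cycles at the FAT class stress range, with slope m = 3 for steel details) can be sketched as follows. The FAT class and stress range used are illustrative assumptions, not values from this work:

```python
# Minimal sketch of nominal-stress fatigue life from a FAT class:
#   N = 2e6 * (FAT / delta_sigma) ** m
# with m = 3, the usual S-N slope for welded/cut steel details.
# FAT 140 and the 200 MPa stress range below are illustrative only.

def fatigue_life_cycles(stress_range_mpa, fat_class_mpa, slope=3.0):
    """Predicted cycles to failure for a given nominal stress range."""
    return 2.0e6 * (fat_class_mpa / stress_range_mpa) ** slope

# Example: an assumed FAT 140 detail under a 200 MPa nominal stress range
life = fatigue_life_cycles(stress_range_mpa=200.0, fat_class_mpa=140.0)
```

By construction, a stress range equal to the FAT class gives exactly 2e6 cycles; doubling the stress range divides the predicted life by 8.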
Abstract:
My research permitted me to reexamine my recent evaluations of the Leaf Project given to the Foundation Year students during the fall semester of 1997. My personal description of the drawing curriculum formed part of the matrix of the Foundation Core Studies at the Ontario College of Art and Design. Research was based on the random selection of 18 students distributed over six of my teaching groups. The entire process included a representation of all grade levels. The intent of the research was to provide a pattern of alternative insights that could yield a more meaningful method of evaluation for visual learners in an art education setting. Visual methods of learning are indeed complex and involve the interplay of many sensory modalities of input. Using a qualitative method of research analysis, a series of queries was arranged in a structured matrix grid to seek out possible and emerging patterns of learning. The grid provided for interrelated visual and linguistic analysis with emphasis on reflection and interconnectedness. Sensory-based modes of learning are currently being studied and discussed amongst educators as alternative approaches to learning. As patterns emerged from the research, it became apparent that a paradigm for evaluation would have to be a progressive profile of learning that takes into account many of the different and evolving learning processes of the individual. A broader review of the student's entire development within the Foundation Year Program would have to be evaluated jointly by a cross-section of representative faculty in the program. The results from the research were never intended to be conclusive. We realized from the start that sensory-based learning is a difficult process to evaluate by the traditional standards used in education. 
The potential of such a process of inquiry permits the researcher to ask for a set of queries that might provide for a deeper form of evaluation unique to the students and their related learning environment. Only in this context can qualitative methods be used to profile their learning experiences in an expressive and meaningful manner.
Abstract:
The purpose of this study was to understand the lived experience of 6 women with recurrent ovarian cancer. Six women were interviewed 2-20 weeks after the recurrence of their ovarian cancer. Interview questions focused on the meaning of the recurrence and their communication with others. Women were asked about the information and support that they felt they needed at that time. Van Manen's method of reflection and writing guided the inquiry. Analysis of the data revealed the themes of: my cancer is back; it means that I will die; talking about it; we are people, we are not a disease; information; and life has changed/life hasn't changed. This study revealed the perspectives of these 6 women with recurrent ovarian cancer and provided an understanding of and knowledge about their lives. Future research should explore the experiences of a larger group of women with recurrent ovarian cancer in order to address their unique needs.
Abstract:
This exploratory mixed-method research project was designed to investigate an area of doctoral education that has received little attention in the past. This research focused specifically on the non-intellectual, hoped-for by-products of doctoral education: the dynamic processes of developing and maintaining both a sense of community and informal mentoring relationships. The design of the study captured the experiences of doctoral students and alumni at various time periods in the doctoral program. Participants represented a diverse group of students with differences in professional and academic backgrounds and life stages. A pilot study for this research suggested that the presence of a sense of community and informal mentoring may provide the necessary relationships to support this diversity. The primary question at the forefront of this study was: Do doctoral students feel connected to one another? Five subquestions were developed to address this research topic: Does a sense of community already exist and flourish in doctoral education? Are the programs and resources of the doctoral program organized to nurture the creation and maintenance of a sense of community? Is a sense of community a foundational element in the formation of naturally occurring relationships among doctoral students? What educational and socio-emotional benefits are associated with informal mentoring relationships during the doctoral experience? And do doctoral students perceive a change in their development as stewards of their discipline over time? The principal methods used to investigate these research questions combined quantitative and qualitative techniques in a concurrent time sequence. The quantitative portion of the study involved a questionnaire, while the qualitative portion involved two approaches: face-to-face interviews and an open-ended question at the end of the questionnaire. 
Findings from the study indicated that the presence of both a sense of community and informal mentoring enhances the overall quality of doctoral education. Program elements that enhanced or hindered connection between students were identified. Both the dynamics and the emotional, social, and academic benefits of informal mentoring were elucidated. Over time, participants perceived changes in their development of the qualities associated with stewardship. This study brought the "hoped-for by-products" associated with doctoral education from the background shadows to an illuminated position at the forefront of inquiry.
Abstract:
We o¤er an axiomatization of the serial cost-sharing method of Friedman and Moulin (1999). The key property in our axiom system is Group Demand Monotonicity, asking that when a group of agents raise their demands, not all of them should pay less.
Abstract:
While channel coding is a standard method of improving a system's energy efficiency in digital communications, its practice does not extend to high-speed links. Increasing demands on network speeds are placing a large burden on the energy efficiency of high-speed links and make the benefit of channel coding for these systems a timely subject. The low error rates of interest and the presence of residual intersymbol interference (ISI) caused by hardware constraints impede the analysis and simulation of coded high-speed links. Focusing on the residual ISI and combined noise as the dominant error mechanisms, this paper analyses error correlation through the concepts of error region, channel signature, and correlation distance. This framework provides a deeper insight into joint error behaviours in high-speed links, extends the range of statistical simulation for coded high-speed links, and provides a case against the use of biased Monte Carlo methods in this setting.
Abstract:
The time-of-detection method for aural avian point counts is a new method of estimating abundance that allows for uncertain probability of detection. The method has been specifically designed to allow for variation in the singing rates of birds. It involves dividing the time interval of the point count into several subintervals and recording, for each bird, the subintervals in which it sings. The method can be viewed as generating data equivalent to closed capture–recapture information. It differs from the distance and multiple-observer methods in that it does not require all the birds to sing during the point count. As this method is new and there is some concern as to how well individual birds can be followed, we carried out a field test of the method using simulated known populations of singing birds, using a laptop computer to send signals to audio stations distributed around a point. The system mimics actual aural avian point counts, but also allows us to know the size and spatial distribution of the populations we are sampling. Fifty 8-min point counts (broken into four 2-min intervals) using eight species of birds were simulated. The singing rate of an individual bird of a species was simulated as a Markovian process (singing bouts followed by periods of silence), which we felt was more realistic than a truly random process. The main emphasis of our paper is to compare results from species singing at (high and low) homogeneous rates per interval with those singing at (high and low) heterogeneous rates. Population size was estimated accurately for the species simulated with a high homogeneous probability of singing. Populations of simulated species with lower but homogeneous singing probabilities were somewhat underestimated. Populations of species simulated with heterogeneous singing probabilities were substantially underestimated. Underestimation was caused both by the very low detection probabilities of distant individuals and by the very low detection probabilities of individuals with low singing rates.
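The data-generating step described above can be sketched as follows: each bird's singing is modelled as a two-state Markov chain over the four 2-min subintervals, and its detection history records the subintervals in which it sang. The transition probabilities and population size below are illustrative assumptions, not the paper's parameters:

```python
# Sketch of Markovian singing over point-count subintervals: a singing
# bout persists with one probability and begins with another, so singing
# is clustered in bouts rather than truly random. Rates are illustrative.
import random

def detection_history(n_intervals, p_start_sing, p_keep_singing, rng):
    """One bird's history: 1 if the bird sang in that subinterval."""
    history, singing = [], False
    for _ in range(n_intervals):
        if singing:
            singing = rng.random() < p_keep_singing  # bout persists
        else:
            singing = rng.random() < p_start_sing    # bout begins
        history.append(1 if singing else 0)
    return history

def simulate_count(n_birds, n_intervals=4, p_start=0.4, p_keep=0.7, seed=1):
    rng = random.Random(seed)
    histories = [detection_history(n_intervals, p_start, p_keep, rng)
                 for _ in range(n_birds)]
    detected = sum(any(h) for h in histories)  # birds that sang at least once
    return detected, histories

detected, histories = simulate_count(50)
```

Birds that never sing during the count are never detected, which is exactly why a closed capture-recapture model on the detection histories, rather than the raw count, is needed to estimate abundance.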
Abstract:
LDL oxidation may be important in atherosclerosis. Extensive oxidation of LDL by copper induces increased uptake by macrophages, but results in decomposition of hydroperoxides, making it more difficult to investigate the effects of hydroperoxides in oxidised LDL on cell function. We describe here a simple method of oxidising LDL by dialysis against copper ions at 4 °C, which inhibits the decomposition of hydroperoxides, and allows the production of LDL rich in hydroperoxides (626 ± 98 nmol/mg LDL protein) but low in oxysterols (3 ± 1 nmol 7-ketocholesterol/mg LDL protein), whilst allowing sufficient modification (2.6 ± 0.5 relative electrophoretic mobility) for rapid uptake by macrophages (5.49 ± 0.75 µg I-125-labelled hydroperoxide-rich LDL vs. 0.46 ± 0.04 µg protein/mg cell protein in 18 h for native LDL). By dialysing under the same conditions, but at 37 °C, the hydroperoxides are decomposed extensively and the LDL becomes rich in oxysterols. This novel method of oxidising LDL with high yield to either a hydroperoxide- or oxysterol-rich form by simply altering the temperature of dialysis may provide a useful tool for determining the effects of these different oxidation products on cell function. (C) 2007 Elsevier Ireland Ltd. All rights reserved.
Abstract:
The adaptive thermal comfort theory considers people as active rather than passive recipients in response to ambient physical thermal stimuli, in contrast with conventional, heat-balance-based thermal comfort theory. Occupants actively interact with the environments they occupy by means of physiological, behavioural and psychological adaptations to achieve 'real world' thermal comfort. This paper introduces a method of quantifying the physiological, behavioural and psychological portions of the adaptation process by using the analytic hierarchy process (AHP), based on case studies conducted in the UK and China. In addition to the three categories of adaptation, which are viewed as criteria, six possible alternatives are considered: physiological indices/health status, the indoor environment, the outdoor environment, personal physical factors, environmental control and thermal expectation. With the AHP technique, all the above-mentioned criteria, factors and corresponding elements are arranged in a hierarchy tree and quantified by using a series of pair-wise judgements. A sensitivity analysis is carried out to improve the quality of these results. The proposed quantitative weighting method provides researchers with opportunities to better understand the adaptive mechanisms and reveal the significance of each category for the achievement of adaptive thermal comfort.
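The AHP weighting step can be sketched as follows: pair-wise judgements form a reciprocal comparison matrix, and the priority weights are its principal eigenvector, computed here by power iteration. The 3x3 example matrix comparing the three adaptation categories is illustrative, not data from the case studies:

```python
# Sketch of AHP priority weighting: normalized principal eigenvector of
# a reciprocal pairwise-comparison matrix, via power iteration.
# The judgement values below are illustrative assumptions.

def ahp_weights(matrix, iters=100):
    """Normalized principal eigenvector of a pairwise-comparison matrix."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]  # renormalize each iteration
    return w

# Reciprocal matrix (a_ji = 1 / a_ij) over, say, the physiological,
# behavioural and psychological categories; behavioural is judged
# 3x as important as physiological here, purely for illustration.
pairwise = [
    [1.0, 1 / 3, 2.0],
    [3.0, 1.0, 4.0],
    [1 / 2, 1 / 4, 1.0],
]
weights = ahp_weights(pairwise)  # weights sum to 1
```

A consistency check (Saaty's consistency ratio) would normally follow before the weights are trusted; the sensitivity analysis mentioned above serves a related quality-control role.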
Abstract:
Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65–89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all the three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland–Altman analysis showed a reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small, but significant, differences for energy and protein intake, reflecting the tendency for the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary, and could be used as an alternative to the food diary for the short-term assessment of an individual’s dietary intake.
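The Bland-Altman agreement analysis used above can be sketched as follows: for paired intake estimates from two methods, compute the mean bias and the 95% limits of agreement (bias ± 1.96 SD of the differences). The example intake values are illustrative, not the study's data:

```python
# Sketch of a Bland-Altman comparison between two dietary assessment
# methods. Values are illustrative daily energy intakes (MJ), chosen so
# that method A records slightly lower intakes, as the text describes.
import statistics

def bland_altman(method_a, method_b):
    """Return (bias, lower_loa, upper_loa) for paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

nana_like = [8.1, 7.5, 9.0, 6.8, 8.4]  # hypothetical computer-record intakes
diary = [8.4, 7.9, 9.1, 7.2, 8.5]      # hypothetical 4 d food diary intakes
bias, lo, hi = bland_altman(nana_like, diary)
# negative bias: the first method records marginally lower intakes
```

In the study's terms, a small bias with narrow limits of agreement is what supports calling the two methods comparable.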
Abstract:
Background: The method of porosity analysis by water absorption has been carried out by storing the specimens in pure water, but this does not exclude the potential plasticising effect of the water, generating unreal porosity values. Objective: The present study evaluated the reliability of this method of porosity analysis in polymethylmethacrylate denture base resins by determining the most satisfactory storage solution (S), in which the plasticising effect is excluded. Materials and methods: Two specimen shapes (rectangular and maxillary denture base) and two denture base resins, water bath-polymerised (Classico) and microwave-polymerised (Acron MC), were used. Saturated anhydrous calcium chloride solutions (25%, 50%, 75%) and distilled water were used for specimen storage. Sorption isotherms were used to determine S. Porosity factor (PF) and diffusion coefficient (D) were calculated within S and for the groups stored in distilled water. ANOVA and Tukey tests were performed to identify significant differences in PF results, and the Kruskal-Wallis test and Dunn multiple comparison post hoc test for D results (alpha = 0.05). Results: For the Acron MC denture base shape, PF results were 0.24% (S 50%) and 1.37% (distilled water); for the rectangular shape, PF was 0.35% (S 75%) and 0.19% (distilled water). For the Classico denture base shape, PF results were 0.54% (S 75%) and 1.21% (distilled water); for the rectangular shape, PF was 0.7% (S 50%) and 1.32% (distilled water). PF results were similar in S and distilled water only for the Acron MC rectangular shape (p > 0.05). D results in distilled water were statistically higher than in S for all groups. Conclusions: The results of the study suggest that an adequate storage solution, chosen to exclude the plasticising effect, must be used when measuring porosity by water absorption.
Abstract:
Nuclear (p,alpha) reactions destroying the so-called "light elements" lithium, beryllium and boron have been widely studied in the past, mainly because of their role in understanding some astrophysical phenomena, i.e. mixing phenomena occurring in young F-G stars [1]. Such mechanisms transport the surface material down to the region close to the nuclear destruction zone, where typical temperatures of the order of ~10^6 K are reached. The corresponding Gamow energy E_0 = 1.22 (Z_x^2 Z_X^2 T_6^2)^(1/3) keV [2] is about ~10 keV if one considers the "boron case" and sets Z_x = 1, Z_X = 5 and T_6 = 5 in the formula. Direct measurements of the two ^11B(p,alpha_0)^8Be and ^10B(p,alpha)^7Be reactions in this energy region are difficult to perform, mainly because of the combined effects of Coulomb barrier penetrability and electron screening [3]. The indirect Trojan Horse Method (THM) [4-6] allows one to extract the two-body reaction cross section of interest for astrophysics without extrapolation procedures. Due to the THM formalism, the extracted indirect data have to be normalized to the available direct data at higher energies, implying that the method is a complementary tool for solving some still open questions for both nuclear and astrophysical issues [7-12].
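The Gamow-energy estimate quoted above can be checked directly: evaluating E_0 = 1.22 (Z_x^2 Z_X^2 T_6^2)^(1/3) keV for the boron case Z_x = 1, Z_X = 5, T_6 = 5 reproduces the ~10 keV figure (the formula is used exactly as the abstract states it):

```python
# Evaluation of the Gamow energy as given in the abstract:
#   E0 = 1.22 * (Zx^2 * ZX^2 * T6^2)^(1/3)  [keV],
# with T = T6 * 1e6 K. For p + 11B: Zx = 1, ZX = 5, T6 = 5.

def gamow_energy_kev(z_projectile, z_target, t6):
    """Gamow peak energy E0 in keV at temperature T = t6 * 1e6 K."""
    return 1.22 * (z_projectile**2 * z_target**2 * t6**2) ** (1.0 / 3.0)

e0 = gamow_energy_kev(1, 5, 5)  # ~10 keV, as stated in the abstract
```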
Abstract:
Despite the many existing crosslinking procedures, glutaraldehyde (GA) is still the method of choice in the manufacture of bioprostheses. The major problems with GA are: (a) uncontrolled reactivity due to the chemical complexity of GA solutions; (b) toxicity due to the release of GA from polymeric crosslinks; and (c) tissue impermeabilization due to the formation of polymeric and heterogeneous crosslinks, partially responsible for the undesirable calcification of the bioprosthesis. A new crosslinking method based on glutaraldehyde acetals has been developed: GA is protected as acetals in acid ethanolic solution and, after distribution inside the matrix, is released for crosslinking. Solutions with hydrochloric acid concentrations between 0.1 and 0.001 mol/L and GA concentrations between 0.1 and 1.0% were measured in an ultraviolet spectrophotometer to verify the presence of free aldehyde groups and polymeric compounds of GA. After these measurements, the solutions were used to crosslink bovine pericardium. The spectrophotometric results showed that GA was best protected in acetal form in acid ethanolic solution with HCl at 0.003 mol/L and GA at 1.0% (v/v). The shrinkage temperature of bovine pericardium crosslinked with the acetal solutions showed values near 85 °C after exposure to triethylamine vapors.
Abstract:
Genes on the X chromosome are known to be responsible for more than 200 hereditary diseases. After IVF, the simple selection of embryo sex before uterine transfer can prevent the occurrence of affected offspring among couples at risk for these genetic disorders. The aim of this investigation was to develop a rapid method of preimplantation genetic diagnosis (PGD) using real-time polymerase chain reaction (PCR) for the sexing of human embryos, and to compare it to the fluorescence in-situ hybridization technique, considered to be the gold standard. After biopsies were obtained from 40 surplus embryos non-viable for transfer, a total of 98 blastomeres were analysed. It was possible to analyse 24 embryos (60%) by both techniques, generating a total of 70 blastomeres (35 per technique), while 28 blastomeres from 16 embryos (40%) were analysed only by real-time PCR. A rapid and safe method was developed in the present study for the sex diagnosis of a single human cell (blastomere and buccal cell) using the emerging technology of real-time PCR. (C) 2009, Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)