965 results for "Curva de Phillips"
Abstract:
Petroleum evaluation consists of analyzing the oil using different methodologies, following international standards, in order to determine its chemical and physicochemical properties, contaminant levels, and composition, and especially its ability to generate derivatives. Many of these analyses consume a lot of time, require large amounts of sample and supplies, and depend on organized transportation logistics, scheduling, and the professionals involved. Looking for alternatives that optimize the evaluation and enable the use of new technologies, seven samples of different centrifuged Brazilian oils, previously characterized by Petrobras, were analyzed by thermogravimetry in the 25–900 °C range using heating rates of 5, 10 and 20 °C per minute. With the experimental data obtained, characterization correlations were performed, providing: generation of simulated true boiling point (TBP) curves; comparison of the generated fractions with the appropriate standard cuts in their temperature ranges; an approach to obtain the Watson characterization factor; and comparison of the micro carbon residue formed. The results showed a good chance of reproducing the simulated TBP curve from thermogravimetry, taking into account the composition, density and other oil properties. The proposed correlations for the experimental characterization factor and carbon residue followed the Petrobras characterizations, showing that thermogravimetry can be used as a tool in oil evaluation because of its quick, accurate analysis and its minimal requirements for samples and consumables.
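For reference, the Watson characterization factor mentioned above is conventionally defined as K_W = T_b^(1/3) / SG, where T_b is the mean average boiling point in degrees Rankine and SG is the specific gravity at 60/60 °F. A minimal Python sketch of this textbook formula; the sample values below are illustrative assumptions, not data from the study:

```python
def watson_k(mean_avg_boiling_point_k: float, specific_gravity: float) -> float:
    """Watson (UOP) characterization factor.

    K_W = T_b^(1/3) / SG, with T_b in degrees Rankine
    (Rankine = Kelvin * 1.8) and SG at 60/60 degrees F.
    """
    t_rankine = mean_avg_boiling_point_k * 1.8
    return t_rankine ** (1.0 / 3.0) / specific_gravity

# Hypothetical example: a crude with a 550 K mean average boiling point
# and specific gravity 0.85 (illustrative values only).
print(f"K_W = {watson_k(550.0, 0.85):.2f}")
# ~11.7: between naphthenic (~11) and paraffinic (>12) character.
```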
Abstract:
The classic slave narrative recounted a fugitive slave’s personal story, condemning slavery and hence working towards abolition. The neo-slave narrative underlines the slave’s historical legacy by unveiling the past through foregrounding African Atlantic experiences, in an attempt to create a critical historiography of the Black Atlantic. The neo-slave narrative is a genre that emerged following World War II and presents us with a dialogue combining the history of 1970–2000. In this thesis I seek to explore how the contemporary counterpart of the classic slave narrative draws on, reflects, or diverges from the general conventions of its predecessor. I argue that by scrutinizing our notion of truth, the neo-slave narrative remains a relevant, important witness to the history of slavery as well as to today’s still racialized society. The historiographic metafiction of the neo-slave narrative rewrites history with the goal of digesting the past and ultimately leading to future reconciliation.
Abstract:
Basic cardiac ultrasound (ECB) is a useful tool in the Intensive Care Unit, as it facilitates the performance of certain interventions. The number of repetitions needed to reach an adequate level of competence has not been defined. The available evidence indicates a minimum of fifty repetitions to attain a certain degree of skill.
Abstract:
The objective of this work was to determine the glycemic curve of juvenile tambaqui fed diets containing 35 and 55% carbohydrates.
Abstract:
Introduction

Many bilinguals will have had the experience of unintentionally reading something in a language other than the intended one (e.g. MUG to mean mosquito in Dutch rather than a receptacle for a hot drink, as one of the possible intended English meanings), of finding themselves blocked on a word for which many alternatives suggest themselves (but, somewhat annoyingly, not in the right language), of their accent changing when stressed or tired and, occasionally, of starting to speak in a language that is not understood by those around them. These instances where lexical access appears compromised and control over language behavior is reduced hint at the intricate structure of the bilingual lexical architecture and the complexity of the processes by which knowledge is accessed and retrieved.

While bilinguals might tend to blame word finding and other language problems on their bilinguality, these difficulties per se are not unique to the bilingual population. However, what is unique, and yet far more common than is appreciated by monolinguals, is the cognitive architecture that subserves bilingual language processing. With bilingualism (and multilingualism) the rule rather than the exception (Grosjean, 1982), this architecture may well be the default structure of the language processing system. As such, it is critical that we understand more fully not only how the processing of more than one language is subserved by the brain, but also how this understanding furthers our knowledge of the cognitive architecture that encapsulates the bilingual mental lexicon.

The neurolinguistic approach to bilingualism focuses on determining the manner in which the two (or more) languages are stored in the brain and how they are differentially (or similarly) processed. The underlying assumption is that the acquisition of more than one language requires at the very least a change to or expansion of the existing lexicon, if not the formation of language-specific components, and this is likely to manifest in some way at the physiological level. There are many sources of information, ranging from data on bilingual aphasic patients (Paradis, 1977, 1985, 1997) to lateralization (Vaid, 1983; see Hull & Vaid, 2006, for a review), recordings of event-related potentials (ERPs) (e.g. Ardal et al., 1990; Phillips et al., 2006), and positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) studies of neurologically intact bilinguals (see Indefrey, 2006; Vaid & Hull, 2002, for reviews). Following the consideration of methodological issues and interpretative limitations that characterize these approaches, the chapter focuses on how the application of these approaches has furthered our understanding of (1) selectivity of bilingual lexical access, (2) distinctions between word types in the bilingual lexicon and (3) control processes that enable language selection.
Abstract:
Theories that inform pedagogical practices have positioned young children as innocent, pre-political and egocentric. This paper draws from an action research study that investigates the impact of “transformative storytelling”, in which stories purposefully crafted to counter metanarratives revealed the impact of human greed to one class of children aged five to six years. Derrida’s notion of “cinders” provided a concept for investigating the traces or imprints the language of story left behind amidst the children’s comments and actions, enabling the possibilities of the history of these “cinders” (that is, what informed these comments and actions) to be noticed. Readings of some of the children’s responses suggest that children aged five and six years can engage in political discourse through the provocation of “transformative storytelling”, and that their engagement demonstrated consideration of others through critical awareness and intersubjectivity. These early readings raise questions regarding curriculum content and pedagogical practices in early years education, and the validity of ongoing educational goals that incorporate critical awareness and intersubjectivity to equip students with communitarian strategies to counter the individualistic outlook of neoliberal societies.
Abstract:
The validation of Computed Tomography (CT) based 3D models is an integral part of studies involving 3D models of bones. This is of particular importance when such models are used for Finite Element studies. The validation of 3D models typically involves the generation of a reference model representing the bone’s outer surface. Several different devices have been utilised for digitising a bone’s outer surface, such as mechanical 3D digitising arms, mechanical 3D contact scanners, electro-magnetic tracking devices and 3D laser scanners. However, none of these devices is capable of digitising a bone’s internal surfaces, such as the medullary canal of a long bone. Therefore, this study investigated the use of a 3D contact scanner, in conjunction with a microCT scanner, for generating a reference standard for validating the internal and external surfaces of a CT based 3D model of an ovine femur. One fresh ovine limb was scanned using a clinical CT scanner (Phillips, Brilliance 64) with a pixel size of 0.4 mm² and slice spacing of 0.5 mm. The limb was then dissected to obtain the soft-tissue-free bone, while care was taken to protect the bone’s surface. A desktop mechanical 3D contact scanner (Roland DG Corporation, MDX 20, Japan) was used to digitise the surface of the denuded bone at a resolution of 0.3 × 0.3 × 0.025 mm. The digitised surfaces were reconstructed into a 3D model using reverse engineering techniques in Rapidform (Inus Technology, Korea). After digitisation, the distal and proximal parts of the bone were removed so that the shaft could be scanned with a microCT scanner (µCT40, Scanco Medical, Switzerland). The shaft, with the bone marrow removed, was immersed in water and scanned with a voxel size of 0.03 mm³. The bone contours were extracted from the image data using the Canny edge filter in Matlab (The MathWorks). The extracted bone contours were reconstructed into 3D models using Amira 5.1 (Visage Imaging, Germany). The 3D models of the bone’s outer surface reconstructed from CT and microCT data were compared against the 3D model generated using the contact scanner. The 3D model of the inner canal reconstructed from the microCT data was compared against the 3D models reconstructed from the clinical CT scanner data. The disparity between the surface geometries of two models was calculated in Rapidform and recorded as an average distance with standard deviation. The comparison of the 3D model of the whole bone generated from the clinical CT data with the reference model gave a mean error of 0.19±0.16 mm, while the shaft was more accurate (0.08±0.06 mm) than the proximal (0.26±0.18 mm) and distal (0.22±0.16 mm) parts. The comparison between the outer 3D model generated from the microCT data and the contact scanner model gave a mean error of 0.10±0.03 mm, indicating that microCT generated models are sufficiently accurate for validating 3D models generated by other methods. The comparison of the inner models generated from microCT data with those from clinical CT data gave an error of 0.09±0.07 mm. Utilising a mechanical contact scanner in conjunction with a microCT scanner thus made it possible to validate both the outer surface of a CT based 3D model of an ovine femur and the surface of the model’s medullary canal.
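The disparity metric reported above (average distance with standard deviation between two surface models) can be illustrated with a nearest-neighbour comparison of vertex clouds. A minimal Python sketch, assuming each surface is available as an N×3 vertex array; this point-to-point comparison is a simplification of the point-to-surface computation performed in Rapidform, and the data below are synthetic:

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_disparity(test_vertices: np.ndarray, reference_vertices: np.ndarray):
    """Mean and SD of nearest-neighbour distances from each test vertex
    to the reference vertex cloud (a point-to-point proxy for the
    point-to-surface comparison described in the abstract)."""
    tree = cKDTree(reference_vertices)
    distances, _ = tree.query(test_vertices)
    return distances.mean(), distances.std()

# Synthetic data only: a random cloud and a jittered copy of it.
rng = np.random.default_rng(0)
reference = rng.uniform(size=(5000, 3))
test = reference + rng.normal(scale=0.05, size=reference.shape)

mean_err, sd_err = surface_disparity(test, reference)
print(f"disparity: {mean_err:.3f} +/- {sd_err:.3f} (arbitrary units)")
```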
Abstract:
This publication is the culmination of a two-year Australian Learning and Teaching Council Priority Programs Research Grant project, which investigated key issues and challenges in developing flexible guidelines for best practice in Australian Doctoral and Masters by Research examination, encompassing the two modes of investigation, written and multi-modal (practice-led/based) theses, their distinctiveness and their potential interplay. The aims of the project were to address issues of assessment legitimacy raised by the entry of practice-orientated dance studies into Australian higher degrees; to examine literal embodiment and presence, as opposed to cultural studies about states of embodiment; to foreground the validity of questions around subjectivity and corporeal intelligence/s and the reliability of artistic/aesthetic communications; and finally to celebrate ‘performance mastery’ (Melrose 2003) as a rigorous and legitimate mode of higher research. The project began with questions centred on: the functions of higher degree dance research; concepts of ‘master-ness’ and ‘doctorateness’; the kinds of languages, structures and processes which may guide candidates, supervisors, examiners and research personnel; the purpose of evaluation/examination; and the positive and negative attributes of examination. Finally, the study examined ways in which academic/professional, writing/dancing, tradition/creation and diversity/consistency relationships might be fostered to embrace change. Over two years, the authors undertook a qualitative national study encompassing a triangulation of semi-structured face-to-face interviews and industry forums to gather views from the profession, together with an analysis of existing guidelines and recent literature in the field. The most significant primary data emerged from 74 qualitative interviews with supervisors, examiners, research deans and administrators, and candidates in dance and, more broadly, across the creative arts. Qualitative data gathered from the two primary sources was coded and analysed using the NVivo software program. Further perspectives were drawn from international consultant and dance researcher Susan Melrose, as well as from publications in the field and from initial feedback on a draft document circulated at the World Dance Alliance Global Summit in July 2008 in Brisbane. Refinement of the data occurred in a continual sifting process until the final publication was produced. This process resulted in a set of guidelines in the form of a complex dynamic system for both product- and process-oriented outcomes of multi-modal theses, along with short position papers on issues which arose from the research, such as contested definitions, embodiment and ephemerality, ‘liveness’ in performance research higher degrees, dissolving theory/practice binaries, the relationship between academe and industry, documenting practices, and a re-consideration of the viva voce.
Abstract:
Areal bone mineral density (aBMD) is the most common surrogate measurement for assessing the bone strength of the proximal femur associated with osteoporosis. Additional factors, however, contribute to the overall strength of the proximal femur, primarily the anatomical geometry. Finite element analysis (FEA) is an effective and widely used computer-based simulation technique for modeling mechanical loading of various engineering structures, providing predictions of displacement and induced stress distribution due to the applied load. FEA is therefore inherently dependent upon both density and anatomical geometry. FEA may be performed on both three-dimensional and two-dimensional models of the proximal femur derived from radiographic images, from which the mechanical stiffness may be predicted. It was examined whether the outcome measures of two-dimensional FEA (finite element analysis of X-ray images, FEXI) and three-dimensional FEA, the computed stiffness of the proximal femur, were more sensitive than aBMD to changes in trabecular bone density and femur geometry. It is assumed that if an outcome measure follows known trends with changes in density and geometric parameters, then an increased sensitivity will be indicative of an improved prediction of bone strength. All three outcome measures increased non-linearly with trabecular bone density, increased linearly with cortical shell thickness and neck width, decreased linearly with neck length, and were relatively insensitive to neck-shaft angle. For femoral head radius, aBMD was relatively insensitive, with two-dimensional FEXI and three-dimensional FEA demonstrating a non-linear increase and decrease in sensitivity, respectively. For neck anteversion, aBMD decreased non-linearly, whereas both two-dimensional FEXI and three-dimensional FEA demonstrated a parabolic-type relationship, with maximum stiffness achieved at an angle of approximately 15°. Multi-parameter analysis showed that all three outcome measures demonstrated their highest sensitivity to a change in cortical thickness. When changes in all input parameters were considered simultaneously, three- and two-dimensional FEA had statistically equal sensitivities (0.41±0.20 and 0.42±0.16 respectively, p = ns) that were significantly higher than the sensitivity of aBMD (0.24±0.07; p = 0.014 and 0.002 for three-dimensional and two-dimensional FEA respectively). This simulation study suggests that, since mechanical integrity and FEA are inherently dependent upon anatomical geometry, FEXI stiffness, being derived from conventional two-dimensional radiographic images, may provide a better prediction of bone strength of the proximal femur than that currently provided by aBMD.
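The sensitivity comparison above can be made concrete with a normalized (relative) sensitivity: the fractional change in an outcome measure divided by the fractional change in an input parameter, estimated here by central finite differences. A minimal Python sketch, assuming this standard definition; the stiffness model is a hypothetical stand-in, not the study's FEA model:

```python
from typing import Callable

def normalized_sensitivity(model: Callable[[float], float],
                           x0: float, rel_step: float = 0.01) -> float:
    """Relative sensitivity (dY/Y) / (dX/X) of a scalar model output
    to one input parameter, via central finite differences."""
    dx = rel_step * x0
    y0 = model(x0)
    return (model(x0 + dx) - model(x0 - dx)) / (2 * dx) * (x0 / y0)

# Hypothetical stand-in: stiffness rising linearly with cortical
# thickness t (mm); a zero-intercept linear model has unit
# relative sensitivity everywhere.
def stiffness(t: float) -> float:
    return 120.0 * t

print(normalized_sensitivity(stiffness, 2.0))  # -> 1.0
```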