851 results for non-contact analysis


Relevance:

90.00%

Publisher:

Abstract:

Objective: This study investigated the influence of injury cause, contact-sport participation, and prior knowledge of mild traumatic brain injury (mTBI) on injury beliefs and chronic symptom expectations of mTBI. Method: A total of 185 non-contact-sport players (non-CSPs) and 59 contact-sport players (CSPs) with no history of mTBI were randomly allocated to one of two conditions in which they read either a vignette depicting a sport-related mTBI (mTBIsport) or a motor-vehicle-accident-related mTBI (mTBIMVA). The vignettes were otherwise standardized to convey the same injury parameters (e.g., duration of loss of consciousness). After reading a vignette, participants reported their injury beliefs (i.e., perceptions of injury undesirability, chronicity, and consequences) and their expectations of chronic postconcussion syndrome (PCS) and posttraumatic stress disorder (PTSD) symptoms. Results: Non-CSPs held significantly more negative beliefs and expected greater PTSD symptomatology and greater PCS affective symptomatology from an mTBIMVA vignette than an mTBIsport vignette, but this difference was not found for CSPs. Unlike CSPs, non-CSPs who personally knew someone who had sustained an mTBI expected significantly less PCS symptomatology than those who did not. Despite these different results for non-CSPs and CSPs, overall, contact-sport participation did not significantly affect injury beliefs and symptom expectations from an mTBIsport vignette. Conclusions: Expectations of persistent problems after an mTBI are influenced by factors such as injury cause even when injury parameters are held constant. Personal knowledge of mTBI, but not contact-sport participation, may account for some variability in mTBI beliefs and expectations. These factors require consideration when assessing mTBI outcome.

Relevance:

90.00%

Publisher:

Abstract:

Rail joints are provided with a gap to account for thermal movement and to maintain electrical insulation for the control of signals and/or broken rail detection circuits. The gap in the rail joint is regarded as a source of significant problems for the rail industry since it leads to a very short rail service life compared with other track components due to the various, and difficult to predict, failure modes – thus increasing the risk for train operations. Many attempts to improve the life of rail joints have led to a large number of patents around the world; notable attempts include strengthening through larger-sized joint bars, an increased number of bolts and the use of high yield materials. Unfortunately, no design to date has shown the ability to prolong the life of the rail joints to values close to those for continuously welded rail (CWR). This paper reports the results of a fundamental study that has revealed that the wheel contact at the free edge of the railhead is a major problem since it generates a singularity in the contact pressure and railhead stresses. A design was therefore developed using an optimisation framework that prevents wheel contact at the railhead edge. Finite element modelling of the design has shown that the contact pressure and railhead stress singularities are eliminated, thus increasing the potential to work as effectively as a CWR that does not have a geometric gap. An experimental validation of the finite element results is presented through an innovative non-contact measurement of strains. Some practical issues related to grinding rails to the optimal design are also discussed.

Relevance:

90.00%

Publisher:

Abstract:

Various Tb theorems play a key role in modern harmonic analysis. They provide characterizations for the boundedness of Calderón-Zygmund type singular integral operators. The general philosophy is that to conclude the boundedness of an operator T on some function space, one needs only to test it on some suitable function b. The main object of this dissertation is to prove very general Tb theorems. The dissertation consists of four research articles and an introductory part. The framework is general with respect to the domain (a metric space), the measure (an upper doubling measure) and the range (a UMD Banach space). Moreover, the testing conditions used are weak. In the first article a (global) Tb theorem on non-homogeneous metric spaces is proved. One of the main technical components is the construction of a randomization procedure for the metric dyadic cubes. The difficulty lies in the fact that metric spaces do not, in general, have a translation group. Also, the measures considered are more general than in the existing literature. This generality is genuinely important for some applications, including the result of Volberg and Wick concerning the characterization of measures for which the analytic Besov-Sobolev space embeds continuously into the space of square integrable functions. In the second article a vector-valued extension of the main result of the first article is considered. This theorem is a new contribution to the vector-valued literature, since previously such general domains and measures were not allowed. The third article deals with local Tb theorems both in the homogeneous and non-homogeneous situations. A modified version of the general non-homogeneous proof technique of Nazarov, Treil and Volberg is extended to cover the case of upper doubling measures. This technique is also used in the homogeneous setting to prove local Tb theorems with weak testing conditions introduced by Auscher, Hofmann, Muscalu, Tao and Thiele. This gives a completely new and direct proof of such results utilizing the full force of non-homogeneous analysis. The final article concerns sharp weighted theory for maximal truncations of Calderón-Zygmund operators. This includes a reduction to certain Sawyer-type testing conditions, which are in the spirit of Tb theorems and thus of the dissertation. The article extends the sharp bounds previously known only for untruncated operators, and also proves sharp weak type results, which are new even for untruncated operators. New techniques are introduced to overcome the difficulties introduced by the non-linearity of maximal truncations.
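
To fix ideas, the classical Euclidean, doubling-measure model statement (a global Tb theorem of David–Journé–Semmes type, for a Calderón–Zygmund operator T satisfying the weak boundedness property and bounded para-accretive functions b1, b2) reads roughly as follows; the dissertation's metric, non-homogeneous and vector-valued settings, and its weak testing conditions, are substantially more general than this model statement.

```latex
T b_1 \in \mathrm{BMO}
\quad \text{and} \quad
T^{*} b_2 \in \mathrm{BMO}
\;\Longrightarrow\;
\| T f \|_{L^2} \lesssim \| f \|_{L^2}.
```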

Relevance:

90.00%

Publisher:

Abstract:

In a cold accretion disc, when the coupling between the neutral gas and the magnetic field is so weak that the magnetorotational instability is less effective or even stops working, it is of prime interest to investigate the purely hydrodynamic origin of turbulence and transport phenomena. As the Reynolds number increases, the relative importance of the non-linear term in the hydrodynamic equation increases. In an accretion disc, where the molecular viscosity is extremely small, the Reynolds number is large enough for the non-linear term to have new effects. We investigate the scenario of the 'weakly non-linear' evolution of the amplitude of the linear mode when the flow is bounded by two parallel walls. The unperturbed flow is similar to the plane Couette flow, but with the Coriolis force included in the hydrodynamic equation. Although there is no exponentially growing eigenmode, because of the self-interaction the least stable eigenmode will grow in an intermediate phase. Later, this will lead to higher-order non-linearity and, plausibly, turbulence. Although the non-linear term in the hydrodynamic equation is energy-conserving, within the weakly non-linear analysis it is possible to define a lower bound of the energy (∝ A_c², where A_c is the threshold amplitude) needed for the flow to transform to the turbulent phase. Such an unstable phase is possible only if the Reynolds number is ≥ 10³–10⁴. The numerical difficulties in obtaining such a large Reynolds number might be the reason for the negative results of numerical simulations of a purely hydrodynamic Keplerian accretion disc.
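
The weakly non-linear picture invoked here can be illustrated, in a hedged way, by a generic cubic (Landau-type) amplitude equation rather than the paper's actual derivation: for a least stable mode of amplitude A with linear decay rate σ < 0 and a destabilising self-interaction coefficient λ > 0,

```latex
\frac{dA}{dt} = \sigma A + \lambda |A|^{2} A ,
\qquad A_c = \sqrt{-\sigma/\lambda},
```

so that, although every infinitesimal perturbation decays, finite perturbations with |A| > A_c grow, consistent with a threshold energy scaling as A_c².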

Relevance:

90.00%

Publisher:

Abstract:

Lamb-wave-type guided wave propagation in foam core sandwich structures and the detectability of damage using a spectral analysis method are reported in this paper. An experimental study, supported by theoretical evaluation of the guided wave characteristics, is presented that shows the applicability of Lamb-wave-type guided ultrasonic waves for the detection of damage in foam core sandwich structures. Sandwich beam specimens were fabricated with a 10 mm thick foam core and 0.3 mm thick aluminum face sheets. Thin piezoelectric patch actuators and sensors are used to excite and sense the guided waves. Group velocity dispersion curves and the frequency response of the sensed signal are obtained experimentally. The nature of the damping present in the sandwich panel is monitored by measuring the sensor signal amplitude at different distances from the center of the linear phased array. Delaminations of increasing width are created and detected experimentally by pitch-catch interrogation with guided waves and wavelet transforms of the sensed signal. Signal amplitudes are analyzed for different damage sizes to differentiate the damage size/severity. A sandwich panel is also fabricated with planar dimensions of 600 mm x 400 mm. A release-film delamination is introduced during fabrication. A non-contact laser Doppler vibrometer (LDV) is used to scan the panel while it is excited with a surface-bonded piezoelectric actuator. The presence of damage is confirmed by the reflected-wave fringe pattern obtained from the LDV scan. With this approach it is possible to locate and monitor the damage by tracking the wave packets scattered from it.
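
As a hedged illustration of the group-velocity measurement (not the authors' actual data pipeline or parameter values), the sketch below synthesises a Hann-windowed toneburst arriving at two sensor distances and estimates the group velocity from the Hilbert-envelope peak times:

```python
# Hedged sketch (not the authors' data pipeline): estimating the group velocity
# of a guided-wave mode from a pitch-catch measurement by tracking the Hilbert
# envelope peak of a windowed toneburst at two sensor distances.
import numpy as np
from scipy.signal import hilbert

fs = 1.0e6                         # sampling rate [Hz] (assumed)
t = np.arange(0.0, 2e-3, 1.0 / fs)
fc = 50e3                          # toneburst centre frequency [Hz] (assumed)
cycles = 5                         # 5-cycle Hann-windowed toneburst

def toneburst(t, t0):
    """Toneburst arriving at time t0."""
    tau = t - t0
    inside = (tau >= 0) & (tau <= cycles / fc)
    window = np.where(inside, 0.5 * (1 - np.cos(2 * np.pi * fc * tau / cycles)), 0.0)
    return window * np.sin(2 * np.pi * fc * tau)

# Signals at two sensors; the group velocity used to synthesise them is 1500 m/s.
d1, d2, c_true = 0.10, 0.25, 1500.0            # sensor distances [m], assumed speed
s1, s2 = toneburst(t, d1 / c_true), toneburst(t, d2 / c_true)

# Envelope-peak arrival times via the Hilbert transform.
t1 = t[np.argmax(np.abs(hilbert(s1)))]
t2 = t[np.argmax(np.abs(hilbert(s2)))]

c_group = (d2 - d1) / (t2 - t1)                # estimated group velocity [m/s]
print(f"estimated group velocity: {c_group:.0f} m/s")
```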

Relevance:

90.00%

Publisher:

Abstract:

Microarraying involves laying down genetic elements onto a solid substrate for DNA analysis on a massively parallel scale. Microarrays are prepared using a pin-based robotic platform to transfer liquid samples from microtitre plates to an array pattern of dots of different liquids on the surface of glass slides, where they dry to form spots of diameter < 200 μm. This paper presents the design, materials selection, micromachining technology and performance of reservoir pins for microarraying. A conical pin is produced by (i) conventional machining of stainless steel or wet etching of tungsten wire, followed by (ii) micromachining with a focused laser to produce a microreservoir and a capillary channel structure leading from the tip. The pin has a flat end of diameter < 100 μm, from which a 500 μm long capillary channel < 15 μm wide leads up the pin to a reservoir. Scanning electron micrographs of the metal surface show roughness on the scale of 10 μm, but the pins nevertheless give consistent and reproducible spotting performance. The pin capacity is 80 nanolitres of fluid containing DNA, and at least 50 spots can be printed before replenishing the reservoir. A typical robot can hold up to 64 pins. This paper discusses the fabrication technology, the performance and spotting uniformity for reservoir pins, the possible limits to miniaturization of pins using this approach, and the future prospects for contact and non-contact arraying technology.
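
From the quoted figures, an upper bound on the average volume dispensed per spot follows directly (assuming the full 80 nL reservoir is consumed over the minimum of 50 spots):

```latex
V_{\text{spot}} \;\le\; \frac{80\ \text{nL}}{50\ \text{spots}} \;=\; 1.6\ \text{nL per spot}.
```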

Relevance:

90.00%

Publisher:

Abstract:

Ordered granular systems have been a subject of active research for decades. Due to their rich dynamic response and nonlinearity, ordered granular systems have been suggested for several applications, such as solitary wave focusing, acoustic signal manipulation, and vibration absorption. Most of the fundamental research performed on ordered granular systems has focused on macro-scale examples. However, most engineering applications require these systems to operate at much smaller scales. Very little is known about the response of micro-scale granular systems, primarily because of the difficulties in realizing reliable and quantitative experiments, which originate from the discrete nature of granular materials and their highly nonlinear inter-particle contact forces.

In this work, we investigate the physics of ordered micro-granular systems by designing an innovative experimental platform that allows us to assemble, excite, and characterize ordered micro-granular systems. This new experimental platform employs a laser system to deliver impulses with controlled momentum and incorporates non-contact measurement apparatuses to detect the particles’ displacement and velocity. We demonstrated the capability of the laser system to excite systems of dry (stainless steel particles of radius 150 micrometers) and wet (silica particles of radius 3.69 micrometers, immersed in fluid) micro-particles, after which we analyzed the stress propagation through these systems.

We derived the equations of motion governing the dynamic response of dry and wet particles on a substrate, which we then validated in experiments. We then measured the losses in these systems and characterized the collision and friction between two micro-particles. We studied wave propagation in one-dimensional dry chains of micro-particles as well as in two-dimensional colloidal systems immersed in fluid. We investigated the influence of defects on wave propagation in the one-dimensional systems. Finally, we characterized the wave attenuation and its relation to the viscosity of the surrounding fluid and performed computer simulations to establish a model that captures the observed response.
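
A minimal numerical sketch of the kind of one-dimensional chain model commonly used for such systems is given below: a frictionless chain of identical spheres with Hertzian contacts, excited by an impulse to the first particle. The parameter values are illustrative only, and the substrate, adhesion and fluid effects studied in this work are not included.

```python
# Minimal sketch: a frictionless 1-D chain of identical spheres with Hertzian
# contacts, F = k * delta^(3/2) (delta = overlap), excited by giving the first
# particle an initial velocity. x are displacements from an initially
# just-touching, uncompressed configuration. Illustrative parameters only.
import numpy as np
from scipy.integrate import solve_ivp

n = 20                  # number of particles
m = 1.1e-7              # particle mass [kg] (roughly a 150 um radius steel sphere)
k = 1.0e9               # Hertzian contact stiffness [N/m^1.5] (assumed)
v0 = 0.1                # initial velocity of the first particle [m/s]

def rhs(t, y):
    x, v = y[:n], y[n:]
    delta = np.maximum(x[:-1] - x[1:], 0.0)   # overlap of neighbouring particles
    f = k * delta ** 1.5                      # repulsive Hertzian contact force
    a = np.zeros(n)
    a[:-1] -= f / m                           # reaction on the left particle
    a[1:] += f / m                            # push on the right particle
    return np.concatenate([v, a])

y0 = np.zeros(2 * n)
y0[n] = v0                                    # first particle moving into the chain
sol = solve_ivp(rhs, (0.0, 2e-4), y0, max_step=1e-7, rtol=1e-8)

# Pulse arrival time at each particle: first time its speed exceeds 1% of v0.
speeds = np.abs(sol.y[n:, :])
arrival = [sol.t[np.argmax(speeds[i] > 0.01 * v0)] for i in range(n)]
print("pulse arrival times [s]:", np.round(arrival, 8))
```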

The findings of the study offer the first systematic experimental and numerical analysis of wave propagation through ordered systems of micro-particles. The experimental system designed in this work provides the necessary tools for further fundamental studies of wave propagation in both granular and colloidal systems.

Relevance:

90.00%

Publisher:

Abstract:

Laser-induced breakdown spectroscopy (LIBS) requires no sample preparation and allows rapid in-situ analysis and real-time control, which gives it great practical value for process control in steel smelting. In this work, a Q-switched Nd:YAG solid-state laser operating at 1064 nm was used as the excitation source, a CCD as the detector, and the high-alloy steel reference material series GBW01605-01609 as samples, and the interaction between the laser and the alloy steel was studied on the LIBS experimental setup that was built. The influence of the observation distance and the laser energy on the characteristics of the laser-induced breakdown spectra of the high-alloy steel samples was studied systematically, the time-resolved characteristics of the LIBS signal were analysed, and the optimum experimental conditions for the quantitative analysis of trace elements in alloy steel by LIBS were determined.

Relevance:

90.00%

Publisher:

Abstract:

Histopathology is the clinical standard for tissue diagnosis. However, histopathology has several limitations including that it requires tissue processing, which can take 30 minutes or more, and requires a highly trained pathologist to diagnose the tissue. Additionally, the diagnosis is qualitative, and the lack of quantitation leads to possible observer-specific diagnosis. Taken together, it is difficult to diagnose tissue at the point of care using histopathology.

Several clinical situations could benefit from more rapid and automated histological processing, which could reduce the time and the number of steps required between obtaining a fresh tissue specimen and rendering a diagnosis. For example, there is need for rapid detection of residual cancer on the surface of tumor resection specimens during excisional surgeries, which is known as intraoperative tumor margin assessment. Additionally, rapid assessment of biopsy specimens at the point-of-care could enable clinicians to confirm that a suspicious lesion is successfully sampled, thus preventing an unnecessary repeat biopsy procedure. Rapid and low cost histological processing could also be potentially useful in settings lacking the human resources and equipment necessary to perform standard histologic assessment. Lastly, automated interpretation of tissue samples could potentially reduce inter-observer error, particularly in the diagnosis of borderline lesions.

To address these needs, high quality microscopic images of the tissue must be obtained in rapid timeframes, in order for a pathologic assessment to be useful for guiding the intervention. Optical microscopy is a powerful technique to obtain high-resolution images of tissue morphology in real-time at the point of care, without the need for tissue processing. In particular, a number of groups have combined fluorescence microscopy with vital fluorescent stains to visualize micro-anatomical features of thick (i.e. unsectioned or unprocessed) tissue. However, robust methods for segmentation and quantitative analysis of heterogeneous images are essential to enable automated diagnosis. Thus, the goal of this work was to obtain high resolution imaging of tissue morphology through employing fluorescence microscopy and vital fluorescent stains and to develop a quantitative strategy to segment and quantify tissue features in heterogeneous images, such as nuclei and the surrounding stroma, which will enable automated diagnosis of thick tissues.

To achieve these goals, three specific aims were proposed. The first aim was to develop an image processing method that can differentiate nuclei from background tissue heterogeneity and enable automated diagnosis of thick tissue at the point of care. A computational technique called sparse component analysis (SCA) was adapted to isolate features of interest, such as nuclei, from the background. SCA has been used previously in the image processing community for image compression, enhancement, and restoration, but has never been applied to separate distinct tissue types in a heterogeneous image. In combination with a high resolution fluorescence microendoscope (HRME) and the contrast agent acriflavine, the utility of this technique was demonstrated through imaging preclinical sarcoma tumor margins. Acriflavine localizes to the nuclei of cells where it reversibly associates with RNA and DNA. Additionally, acriflavine shows some affinity for collagen and muscle. SCA was adapted to isolate acriflavine positive features or APFs (which correspond to RNA and DNA) from background tissue heterogeneity. The circle transform (CT) was applied to the SCA output to quantify the size and density of overlapping APFs. The sensitivity of the SCA+CT approach to variations in APF size, density and background heterogeneity was demonstrated through simulations. Specifically, SCA+CT achieved the lowest errors for higher contrast ratios and larger APF sizes. When applied to tissue images of excised sarcoma margins, SCA+CT correctly isolated APFs and showed consistently increased density in tumor and tumor + muscle images compared to images containing muscle. Next, variables were quantified from images of resected primary sarcomas and used to optimize a multivariate model. The sensitivity and specificity for differentiating positive from negative ex vivo resected tumor margins were 82% and 75%. The utility of this approach was further tested by imaging the in vivo tumor cavities from 34 mice after resection of a sarcoma, with local recurrence as a benchmark. When applied prospectively to images from the tumor cavity, the sensitivity and specificity for differentiating local recurrence were 78% and 82%. The results indicate that SCA+CT can accurately delineate APFs in heterogeneous tissue, which is essential to enable automated and rapid surveillance of tissue pathology.
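
The circle-transform step is described here only at a high level; as a hedged stand-in, the sketch below uses the circular Hough transform from scikit-image to estimate the count and radii of overlapping disk-like features (a proxy for APFs) in a synthetic image:

```python
# Hedged stand-in for the circle-transform step: detect overlapping circular
# features (a proxy for APFs) in a synthetic image with scikit-image's circular
# Hough transform, then report their count and radii.
import numpy as np
from skimage.draw import disk
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

# Synthetic image: three partially overlapping bright disks on a dark background.
img = np.zeros((200, 200), dtype=float)
for centre, r in [((60, 60), 18), ((75, 80), 15), ((140, 120), 20)]:
    rr, cc = disk(centre, r, shape=img.shape)
    img[rr, cc] = 1.0

edges = canny(img, sigma=2.0)             # edge map fed to the Hough transform
radii = np.arange(10, 30)                 # candidate radii in pixels (assumed range)
hspaces = hough_circle(edges, radii)
accums, cx, cy, found_r = hough_circle_peaks(hspaces, radii, total_num_peaks=3)

density = len(found_r) / img.size         # detected features per pixel
print("detected radii (px):", found_r)
print("feature density (per pixel):", density)
```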

Two primary challenges were identified in the work in aim 1. First, while SCA can be used to isolate features, such as APFs, from heterogeneous images, its performance is limited by the contrast between APFs and the background. Second, while it is feasible to create mosaics by scanning a sarcoma tumor bed in a mouse, which is on the order of 3-7 mm in any one dimension, it is not feasible to evaluate an entire human surgical margin. Thus, improvements to the microscopic imaging system were made to (1) improve image contrast through rejecting out-of-focus background fluorescence and to (2) increase the field of view (FOV) while maintaining the sub-cellular resolution needed for delineation of nuclei. To address these challenges, a technique called structured illumination microscopy (SIM) was employed in which the entire FOV is illuminated with a defined spatial pattern rather than scanning a focal spot, such as in confocal microscopy.
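
For context, one widely used optical-sectioning reconstruction for structured illumination (the three-phase demodulation of Neil, Juškaitis and Wilson) combines three images I1, I2, I3 acquired with the grid shifted by one third of a period; only in-focus light carries the grid modulation, so the out-of-focus background cancels. Whether this particular demodulation is the one used here is not stated in the abstract:

```latex
I_{\text{sectioned}} \;=\; \sqrt{(I_1 - I_2)^2 + (I_2 - I_3)^2 + (I_3 - I_1)^2}.
```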

Thus, the second aim was to improve image contrast and increase the FOV through employing wide-field, non-contact structured illumination microscopy and to optimize the segmentation algorithm for the new imaging modality. Both image contrast and FOV were increased through the development of a wide-field fluorescence SIM system. Clear improvement in image contrast was seen in structured illumination images compared to uniform illumination images. Additionally, the FOV is over 13X larger than that of the fluorescence microendoscope used in aim 1. Initial segmentation results of SIM images revealed that SCA is unable to segment large numbers of APFs in the tumor images. Because the FOV of the SIM system is over 13X larger than the FOV of the fluorescence microendoscope, dense collections of APFs commonly seen in tumor images could no longer be sparsely represented, and the fundamental sparsity assumption associated with SCA was no longer met. Thus, an algorithm called maximally stable extremal regions (MSER) was investigated as an alternative approach for APF segmentation in SIM images. MSER was able to accurately segment large numbers of APFs in SIM images of tumor tissue. In addition to optimizing MSER for SIM image segmentation, an optimal frequency of the illumination pattern used in SIM was carefully selected because the image signal-to-noise ratio (SNR) is dependent on the grid frequency. A grid frequency of 31.7 mm⁻¹ led to the highest SNR and lowest percent error associated with MSER segmentation.
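
A hedged sketch of MSER-based segmentation of bright, nucleus-like features is given below, using OpenCV's MSER implementation on a synthetic image; the detector settings are illustrative and are not the values optimised in this work:

```python
# Hedged sketch of MSER-based segmentation of bright, nucleus-like features,
# using OpenCV's MSER on a synthetic 8-bit image; detector settings are
# illustrative, not the values optimised in this work.
import cv2
import numpy as np

img = np.zeros((200, 200), dtype=np.uint8)
for centre in [(50, 60), (120, 80), (150, 150)]:
    cv2.circle(img, centre, 8, 200, -1)       # bright, roughly nucleus-sized blobs
img = cv2.GaussianBlur(img, (5, 5), 0)

mser = cv2.MSER_create(5, 20, 2000)           # (delta, min_area, max_area), assumed
regions, bboxes = mser.detectRegions(img)

areas = [len(r) for r in regions]             # region sizes in pixels
print(f"{len(regions)} stable regions, median area {np.median(areas):.0f} px")
```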

Once MSER was optimized for SIM image segmentation and the optimal grid frequency was selected, a quantitative model was developed to diagnose mouse sarcoma tumor margins that were imaged ex vivo with SIM. Tumor margins were stained with acridine orange (AO) in aim 2 because AO was found to stain the sarcoma tissue more brightly than acriflavine. Both acriflavine and AO are intravital dyes, which have been shown to stain nuclei, skeletal muscle, and collagenous stroma. A tissue-type classification model was developed to differentiate localized regions (75x75 µm) of tumor from skeletal muscle and adipose tissue based on the MSER segmentation output. Specifically, a logistic regression model was used to classify each localized region. The logistic regression model yielded an output in terms of probability (0-100%) that tumor was located within each 75x75 µm region. The model performance was tested using a receiver operator characteristic (ROC) curve analysis that revealed 77% sensitivity and 81% specificity. For margin classification, the whole margin image was divided into localized regions and this tissue-type classification model was applied. In a subset of 6 margins (3 negative, 3 positive), it was shown that with a tumor probability threshold of 50%, 8% of all regions from negative margins exceeded this threshold, while over 17% of all regions exceeded the threshold in the positive margins. Thus, 8% of regions in negative margins were considered false positives. These false positive regions are likely due to the high density of APFs present in normal tissues, which clearly demonstrates a challenge in implementing this automatic algorithm based on AO staining alone.
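
The classification step can be sketched, under stated assumptions, with a logistic-regression model mapping per-region segmentation features to a tumor probability, followed by ROC analysis; the feature values below are synthetic and purely illustrative:

```python
# Hedged sketch of the tissue-type classification step: a logistic-regression
# model mapping per-region segmentation features to a tumor probability, with
# ROC analysis. Feature values here are synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
n = 400
# Illustrative per-region features, e.g. APF density and median APF size.
density_tumor = rng.normal(0.08, 0.02, n // 2)
density_normal = rng.normal(0.04, 0.02, n // 2)
size_tumor = rng.normal(9.0, 2.0, n // 2)
size_normal = rng.normal(7.0, 2.0, n // 2)

X = np.column_stack([np.concatenate([density_tumor, density_normal]),
                     np.concatenate([size_tumor, size_normal])])
y = np.concatenate([np.ones(n // 2), np.zeros(n // 2)])   # 1 = tumor region

model = LogisticRegression().fit(X, y)
p_tumor = model.predict_proba(X)[:, 1]        # probability that a region is tumor

fpr, tpr, thresholds = roc_curve(y, p_tumor)
print(f"AUC = {auc(fpr, tpr):.2f}")

# Margin-level read-out: fraction of regions exceeding a 50% tumor probability.
print(f"regions above 50% threshold: {np.mean(p_tumor > 0.5):.1%}")
```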

Thus, the third aim was to improve the specificity of the diagnostic model through leveraging other sources of contrast. Modifications were made to the SIM system to enable fluorescence imaging at a variety of wavelengths. Specifically, the SIM system was modified to enable imaging of red fluorescent protein (RFP)-expressing sarcomas, which were used to delineate the location of tumor cells within each image. Initial analysis of AO-stained panels confirmed that there was room for improvement in tumor detection, particularly with regard to false positive regions that were negative for RFP. One approach for improving the specificity of the diagnostic model was to investigate using a fluorophore that was more specific for staining tumor. Specifically, tetracycline was selected because it appeared to specifically stain freshly excised tumor tissue in a matter of minutes, and was non-toxic and stable in solution. Results indicated that tetracycline staining has promise for increasing the specificity of tumor detection in SIM images of a preclinical sarcoma model, and further investigation is warranted.

In conclusion, this work presents the development of a combination of tools that is capable of automated segmentation and quantification of micro-anatomical images of thick tissue. When compared to the fluorescence microendoscope, wide-field multispectral fluorescence SIM imaging provided improved image contrast, a larger FOV with comparable resolution, and the ability to image a variety of fluorophores. MSER was an appropriate and rapid approach to segment dense collections of APFs from wide-field SIM images. Variables that reflect the morphology of the tissue, such as the density, size, and shape of nuclei and nucleoli, can be used to automatically diagnose SIM images. The clinical utility of SIM imaging and MSER segmentation to detect microscopic residual disease has been demonstrated by imaging excised preclinical sarcoma margins. Ultimately, this work demonstrates that fluorescence imaging of tissue micro-anatomy combined with a specialized algorithm for delineation and quantification of features is a means for rapid, non-destructive and automated detection of microscopic disease, which could improve cancer management in a variety of clinical scenarios.

Relevance:

90.00%

Publisher:

Abstract:

The issues surrounding the collision of projectiles with structures have gained a high profile since the events of 11th September 2001. In such collision problems, the projectile penetrates the structure so that tracking the interface between one material and another becomes very complex, especially if the projectile is essentially a vessel containing a fluid, e.g. a fuel load. The subsequent combustion, heat transfer and melting and re-solidification processes in the structure render this a very challenging computational modelling problem. The conventional approach to the analysis of collision processes involves a Lagrangian-Lagrangian, contact-driven methodology. This approach suffers from a number of disadvantages in its implementation, most of which are associated with the challenges of the contact analysis component of the calculations. This paper describes a 'two-fluid' approach to high-speed impact between solid structures, where the objective is to overcome the problems of penetration and re-meshing. The work has been carried out using the finite volume, unstructured mesh multi-physics code PHYSICA+, where the three-dimensional fluid flow, free surface, heat transfer, combustion, melting and re-solidification algorithms are approximated using cell-centred finite volume, unstructured mesh techniques on a collocated mesh. The basic procedure is illustrated for two cases of Newtonian and non-Newtonian flow to test several of its component capabilities in the analysis of problems of industrial interest.
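
As a generic illustration of the cell-centred finite-volume building block such codes assemble (and not of PHYSICA+ itself), the sketch below advects a scalar on a uniform one-dimensional mesh with a first-order upwind flux:

```python
# Generic cell-centred finite-volume sketch (illustrative only, unrelated to
# PHYSICA+): first-order upwind advection of a scalar phi on a uniform 1-D mesh.
import numpy as np

nx, L = 100, 1.0
dx = L / nx
u = 1.0                                   # constant advection velocity [m/s]
dt = 0.5 * dx / u                         # time step chosen so that CFL = 0.5
x = np.linspace(dx / 2, L - dx / 2, nx)   # cell-centre coordinates
phi = np.where(np.abs(x - 0.3) < 0.1, 1.0, 0.0)   # initial square pulse

for _ in range(100):                      # advance to t = 0.5 s
    flux = u * phi                        # upwind face flux (u > 0: upstream cell value)
    # phi_i^(n+1) = phi_i^n - dt/dx * (F_out - F_in) for each cell
    phi[1:] -= dt / dx * (flux[1:] - flux[:-1])
    phi[0] -= dt / dx * (flux[0] - 0.0)   # inflow boundary with phi = 0

print("pulse centre is now near x =", round(x[np.argmax(phi)], 2))
```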

Relevance:

90.00%

Publisher:

Abstract:

A sub-chronic administration of phencyclidine to the rat brings about enduring pathophysiological and cognitive changes that resemble some features of schizophrenia. The present study aimed to determine whether the behavioural consequence of this phencyclidine regime extends to a long-term disruption of social interaction that might provide a parallel with some negative symptoms of the disease. Rats were treated with phencyclidine (2 mg/kg twice daily for 1 week) or vehicle, followed by a drug-free period. Social interaction was assessed 24 h, 1 week, 3 weeks and 6 weeks post-treatment. A long-lasting disturbance of social behaviour was observed in the phencyclidine group, namely more contact and non-contact interaction with an unfamiliar target rat at all time points. Six weeks post-phencyclidine, analysis of brains showed a reduction in the expression of parvalbumin-immunoreactive neurons in the hippocampus, with significant reductions localised to the CA1 and dentate gyrus regions. These results show that sub-chronic phencyclidine produces long-lasting disruptions in social interaction that, however, do not model the social withdrawal seen in patients with schizophrenia. These disturbances of social behaviour may be associated with concurrent pathophysiological brain changes.

Relevance:

90.00%

Publisher:

Abstract:

In this paper the use of eigenvalue stability analysis of very large dimension aeroelastic numerical models arising from the exploitation of computational fluid dynamics is reviewed. A formulation based on a block reduction of the system Jacobian proves powerful to allow various numerical algorithms to be exploited, including frequency domain solvers, reconstruction of a term describing the fluid–structure interaction from the sparse data which incurs the main computational cost, and sampling to place the expensive samples where they are most needed. The stability formulation also allows non-deterministic analysis to be carried out very efficiently through the use of an approximate Newton solver. Finally, the system eigenvectors are exploited to produce nonlinear and parameterised reduced order models for computing limit cycle responses. The performance of the methods is illustrated with results from a number of academic and large dimension aircraft test cases.
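
A minimal numerical illustration of the block-reduction idea (not the formulation or code used in the paper) is given below: for a Jacobian partitioned into fluid and structural blocks, a coupled eigenvalue of the full system makes the reduced structural matrix singular, with the Schur-complement term carrying the fluid–structure interaction:

```python
# Minimal sketch of block reduction of a coupled Jacobian J = [[Aff, Afs],
# [Asf, Ass]]: an eigenvalue lam of the full system makes the reduced structural
# matrix  S(lam) = Ass - lam*I - Asf (Aff - lam*I)^{-1} Afs  singular, the last
# term carrying the fluid-structure interaction. Random blocks, illustrative only.
import numpy as np

rng = np.random.default_rng(1)
nf, ns = 40, 4                                  # "fluid" and "structural" block sizes
Aff = 0.3 * rng.standard_normal((nf, nf)) - 2.0 * np.eye(nf)   # damped fluid block
Afs = 0.1 * rng.standard_normal((nf, ns))
Asf = 0.1 * rng.standard_normal((ns, nf))
Ass = rng.standard_normal((ns, ns)) - 0.5 * np.eye(ns)

J = np.block([[Aff, Afs], [Asf, Ass]])
lam = max(np.linalg.eigvals(J), key=lambda z: z.real)    # least-stable eigenvalue

def reduced(lam):
    """Structural block with the interaction term folded in via a Schur complement."""
    interaction = Asf @ np.linalg.solve(Aff - lam * np.eye(nf), Afs)
    return Ass - lam * np.eye(ns) - interaction

sigma_min = np.linalg.svd(reduced(lam), compute_uv=False)[-1]
print("least-stable coupled eigenvalue:", lam)
print("smallest singular value of the reduced matrix:", sigma_min)   # ~ 0
```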

Relevance:

90.00%

Publisher:

Abstract:

This study is intended as a contribution to answering the challenging appeals of several authors and international organisations, in particular the United Nations, which recognised the importance of education for sustainability through the proposal of the Decade of Education for Sustainable Development (2005-2014). In-service teacher education developed in interdisciplinary learning communities made up of Science and Philosophy teachers was taken in this research as an instrument that can foster greater interdisciplinarity between these areas of knowledge and promote teachers' professional development. The study is organised in three phases. Phase I: diagnosis of the conceptions of Science and Philosophy teachers regarding the relevance they attribute to inter-group (Science/Philosophy) interactions as a contribution to improving their teaching practices and students' learning in the context of education for sustainability. Phase II: design and implementation of an in-service training programme in an interdisciplinary learning community made up of Science and Philosophy teachers from one school. Phase III: evaluation of the teachers'/trainees' perceptions of the impacts of the training programme on the increase of interdisciplinarity, on the teaching of the theme Sustainability on Earth, on the participants' professional development and on in-service teacher education practices.

Phase I focused on the diagnosis of Science and Philosophy teachers' conceptions, for which a questionnaire was designed and administered in eighteen schools of the former CAE of Viseu, answered by 185 teachers. The indicators obtained show that Science and Philosophy teachers recognise the relevance of their own subject areas to students' education. However, interdisciplinarity between Science and Philosophy is scarce, even though teachers consider that it can be useful in planning teaching activities. They also consider that it can promote the implementation of more diversified teaching strategies and contribute to strengthening a culture of collaboration in schools. Science and Philosophy teachers recognise that addressing the theme Sustainability on Earth requires concepts beyond those covered individually in the subjects they teach, and that it can be facilitated by implementing teaching materials built with colleagues from their own and other subject groups.

Phase II starts from the diagnosed conceptions, from indicators of research in Didactics, from the current teaching and learning perspectives advocated for Basic and Secondary Education, and from a reflective, critical and ecological model of teacher education (Bronfenbrenner, 1979; Alarcão, 1996; Sá-Chaves, 1997; Schön, 2000), on which basis an interdisciplinary (Science and Philosophy) in-service teacher education programme was developed. This training programme was delivered to twenty-four Science and Philosophy teachers of a secondary school in the Centre-North region of Portugal during the 2008/2009 school year and comprised fifty contact hours and fifty non-contact hours. It deepened knowledge in the fields of science education and philosophy education, promoting the connection between the two and thereby facilitating their didactic transposition. The programme promoted the articulation between research and teaching practice and provided for the analysis and deepening of themes that cut across Science and Philosophy, seeking in this way to build on the specificity of the areas of knowledge involved and to enrich the participants' perspectives. From the dynamics established, training pathways emerged that allowed the construction of teaching materials, structured from a constructivist and interdisciplinary perspective, for the theme Sustainability on Earth, taught in the Science subjects (Basic and Secondary Education) and in Philosophy (Secondary Education). These teaching materials were subsequently implemented by the Science and Philosophy teachers in the classroom.

In Phase III of this research, perceptions of the impacts of the training programme were evaluated with respect to the increase of interdisciplinarity between Science and Philosophy teachers, the teaching of the theme Sustainability on Earth, the professional development of the participating teachers and the improvement of in-service teacher education practices. The indicators obtained suggest that the training process experienced contributed to: helping to break down existing disciplinary barriers between Science and Philosophy teachers; the construction, within an interdisciplinary logic, of diversified teaching materials for the theme Sustainability on Earth, which were recognised by students as innovative and important for experiencing active, contextualised learning; the modification of some of the participating teachers' pedagogical practices; and the identification of the potential of interdisciplinary learning communities in in-service teacher education. The conclusions of this research suggest that new paths need to be opened in the field of in-service teacher education, seeking to create working and cooperation mechanisms that allow an effective sharing of knowledge and values among teachers of different subject areas and that inform new attitudes, real and consistent with a reflective and interdisciplinary pedagogical practice. Thus, the adoption of a reflective model of in-service teacher education, based on the constitution of interdisciplinary learning communities, is considered to help teachers develop a more integrative view of knowledge and to recognise the potential of interdisciplinarity between Science and Philosophy in improving teaching practices. It can also constitute a response to the challenges of education in the 21st century, facilitating the exercise of responsible and participatory citizenship and pointing to perspectives for solving problems of today's society, including those related to the sustainability of planet Earth.

Relevance:

90.00%

Publisher:

Abstract:

Infrared thermography is a non-invasive technique that measures mid- to long-wave infrared radiation emanating from all objects and converts this to temperature. As an imaging technique, the value of modern infrared thermography is its ability to produce a digitized image or high-speed video rendering a thermal map of the scene in false colour. Since temperature is an important environmental parameter influencing animal physiology, and metabolic heat production is an energetically expensive process, measuring temperature and energy exchange in animals is critical to understanding physiology, especially under field conditions. As a non-contact approach, infrared thermography provides a non-invasive complement to physiological data gathering. One caveat, however, is that only surface temperatures are measured, which guides much research to those thermal events occurring at the skin and insulating regions of the body. As an imaging technique, infrared thermal imaging is also subject to certain uncertainties that require physical modeling, which is typically done via built-in software approaches. Infrared thermal imaging has enabled different insights into the comparative physiology of phenomena ranging from thermogenesis, peripheral blood flow adjustments and evaporative cooling to respiratory physiology. In this review, I provide background and guidelines for the use of thermal imaging, primarily aimed at field physiologists and biologists interested in thermal biology. I also discuss some of the better-known approaches and discoveries revealed by thermal imaging, with the objective of encouraging more quantitative assessment.
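
The radiation-to-temperature conversion can be made concrete with a simplified grey-body model that subtracts the reflected ambient contribution; real thermal cameras use calibrated, band-limited Planck-curve fits handled by the built-in software mentioned above, so the sketch below is illustrative only:

```python
# Simplified grey-body radiometric inversion (illustrative; cameras use
# band-limited, calibrated Planck models rather than this broadband form).
SIGMA = 5.670374419e-8        # Stefan-Boltzmann constant [W m^-2 K^-4]

def surface_temperature(W_measured, emissivity, T_reflected):
    """Recover surface temperature [K] from measured radiant exitance [W/m^2],
    subtracting the reflected ambient term (1 - eps) * sigma * T_refl^4."""
    W_object = (W_measured - (1.0 - emissivity) * SIGMA * T_reflected ** 4) / emissivity
    return (W_object / SIGMA) ** 0.25

# Example: emissivity 0.95 (a typical assumption for skin or fur), 20 C surroundings.
print(surface_temperature(W_measured=450.0, emissivity=0.95, T_reflected=293.15))
```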

Relevance:

90.00%

Publisher:

Abstract:

The photothermal effect refers to the heating of a sample due to the absorption of electromagnetic radiation. Photothermal (PT) heat generation, an example of energy conversion, has in general three kinds of applications: (1) PT material probing, (2) PT material processing and (3) PT material destruction. The temperature involved increases from (1) to (3). Of these, PT material probing makes the most significant contribution to science and technology. Photothermal material characterization relies on high-sensitivity detection techniques to monitor the effects caused by PT heating of a sample. The photothermal method is a powerful, high-sensitivity, non-contact tool for non-destructive thermal characterization of materials. The high sensitivity of photothermal methods has led to their application in the analysis of low-absorbance samples. Laser calorimetry, photothermal radiometry, the pyroelectric technique, the photoacoustic technique, the photothermal beam deflection technique, etc. come under the broad class of photothermal techniques. However, the choice of a suitable technique depends upon the nature of the sample, the purpose of the measurement, the nature of the light source used, etc. The present investigations are carried out on polymer thin films employing the photothermal beam deflection technique for the determination of their thermal diffusivity. Here the sample is excited by a He-Ne laser (λ = 6328 Å), which acts as the pump beam. Due to the refractive-index gradient established at the sample surface and in the adjacent coupling medium, another optical beam, called the probe beam (diode laser, λ = 6500 Å), experiences a deflection when passed through this region. The deflection is detected using a position-sensitive detector whose output is fed to a lock-in amplifier, from which the amplitude and phase of the deflection can be obtained directly. The amplitude and phase of the signal are suitably analysed to determine the thermal diffusivity. The production of polymer thin-film samples has gained considerable attention over the past few years. Plasma polymerization is an inexpensive tool for fabricating organic thin films. It refers to the formation of polymeric materials under the influence of a plasma, which is generated by some kind of electric discharge. Here the plasma of the monomer vapour is generated by employing radio frequency (MHz) techniques. The plasma polymerization technique results in homogeneous, highly adhesive, thermally stable, pinhole-free, dielectric, highly branched and cross-linked polymer films. The possible linkage in the formation of the polymers is suggested by comparing the FTIR spectra of the monomer and the polymer. Near-IR overtone investigations on some organic molecules using the local mode model are also carried out. Higher vibrational overtones often provide spectral simplification and greater resolution of peaks corresponding to nonequivalent X-H bonds, where X is typically C, N or O. Vibrational overtone spectroscopy of molecules containing X-H oscillators is now a well-established tool for molecular investigations. Conformational and steric differences between bonds and structural inequivalence of CH bonds (methyl, aryl, acetylenic, etc.) are resolvable in the higher overtone spectra. The local mode model, in which the X-H oscillators are considered to be loosely coupled anharmonic oscillators, has been widely used for the interpretation of overtone spectra.
If a single local oscillator is excited from the vibrational ground state to the vibrational state v, then the transition energy of the local mode overtone is given by ΔE(0→v) = Av + Bv². A plot of ΔE/v versus v will yield A, the local mode frequency, as the intercept and B, the local mode diagonal anharmonicity, as the slope. Here A - B gives the mechanical frequency X1 of the oscillator and B = X2 is the anharmonicity of the bond. The local mode parameters X1 and X2 vary for non-equivalent X-H bonds and are sensitive to the inter- and intramolecular environment of the X-H oscillator.
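
The fitting procedure described above can be sketched numerically as follows, using synthetic overtone energies generated from assumed local-mode parameters (not real spectroscopic data) and recovering them from a linear fit of ΔE/v against v:

```python
# Sketch of the local-mode analysis: Delta E(0->v) = A*v + B*v^2, so a linear
# fit of Delta E / v against v gives intercept A and slope B; then the
# mechanical frequency is X1 = A - B and the anharmonicity is X2 = B.
# The "observed" energies below are synthetic, from assumed A and B values.
import numpy as np

A_true, B_true = 3100.0, -60.0                 # assumed local-mode parameters [cm^-1]
v = np.arange(1, 6)                            # overtone levels v = 1..5
dE = A_true * v + B_true * v ** 2              # synthetic transition energies
dE += np.random.default_rng(0).normal(0, 2, v.size)   # small "measurement" noise

B_fit, A_fit = np.polyfit(v, dE / v, 1)        # slope = B, intercept = A
X1, X2 = A_fit - B_fit, B_fit                  # mechanical frequency, anharmonicity
print(f"A = {A_fit:.1f} cm^-1, B = {B_fit:.1f} cm^-1, X1 = {X1:.1f}, X2 = {X2:.1f}")
```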