934 results for SMITH, A. Mark. (2008a). "Alhacen's Approach to 'Alhazen's Problem'". Arabic Sciences and Philosophy, vol. 18, pp. 143-163.
Abstract:
More than 200 known diseases are transmitted via foods or food products. In the United States, food-borne diseases are responsible for 76 million cases of illness, 32,500 hospitalisations and 5,000 deaths yearly. The ongoing increase in worldwide trade in livestock, food and food products, combined with increasing human mobility (business and leisure travel, emigration, etc.), will raise the risk of emergence and spread of such pathogens. There is therefore an urgent need to develop rapid, efficient and reliable methods for the detection and identification of such pathogens.
Microchip fabrication has had a major impact on electronics and is expected to have an equally pronounced effect on the life sciences. By combining microfluidics with micromechanics, micro-optics and microelectronics, systems can be realized that perform complete chemical or biochemical analyses. These so-called 'Lab-on-a-Chip' devices will completely change the face of laboratories in the future, where smaller, fully automated devices will be able to perform assays faster, more accurately and at lower cost than today's equipment. A general introduction to food safety and applied micro- and nanotechnology in the life sciences will be given. In addition, examples will be presented of DNA microarrays, microfabricated integrated PCR chips and fully integrated lab-on-a-chip systems from different national and EU research projects carried out at the Laboratory of Applied Micro-Nanotechnology (LAMINATE) group at the National Veterinary Institute (DTU-Vet), Technical University of Denmark, the BioLabchip group at the Department of Micro and Nanotechnology (DTU-Nanotech), Technical University of Denmark (DTU), Ikerlan-IK4 (Spain) and 16 other partners from different European countries.
Abstract:
Postgraduate program in Molecular Biophysics (Pós-graduação em Biofísica Molecular) - IBILCE
Abstract:
To reconstruct the vegetation and fire history of the Upper Engadine, two continuous sediment cores from Lej da Champfèr and Lej da San Murezzan (Upper Engadine Valley, southeastern Switzerland) were analysed for pollen, plant macrofossils, charcoal and kerogen. The chronologies of the cores are based on 38 radiocarbon dates. Pollen and macrofossil data suggest a rapid afforestation with Betula, Pinus sylvestris, Pinus cembra, and Larix decidua after the retreat of the glaciers from the lake catchments 11,000 cal years ago. This vegetation type persisted until ca. 7300 cal b.p. (5350 b.c.), when Picea replaced Pinus cembra. Pollen indicative of human impact suggests that in this high-mountain region of the central Alps strong anthropogenic activities began during the Early Bronze Age (3900 cal b.p., 1950 b.c.). Local human settlements led to vegetational changes, promoting the expansion of Larix decidua and Alnus viridis. In the case of Larix, continuing land use and especially grazing after fire led to the formation of Larix meadows. The expansion of Alnus viridis was directly induced by fire, as evidenced by time-series analysis. Subsequently, the process of forest conversion into open landscapes continued for millennia and reached its maximum at the end of the Middle Ages at around 500 cal b.p. (a.d. 1450).
Abstract:
This Dissertation investigates the field of computer-based image recognition applied to medical imaging in mammography. There is interest in developing learning systems that assist radiologists in the recognition of microcalcifications, to support them in screening programs for the prevention of breast cancer. Analysis of microcalcifications has emerged as a key technique for early diagnosis of breast cancer, but the design of automatic systems to recognize them is complicated by the variability and conditions of mammographic images. In this Thesis the theoretical approaches to designing image recognition systems are discussed, with emphasis on the specific problems of detection and classification of microcalcifications. Our study covers techniques ranging from morphological operators, neural networks and support vector machines to the most recent deep convolutional neural networks, analyzing the importance of the concepts of scale and hierarchy at the design stage and their implications for the choice of the network's connections and layers. With these theoretical foundations and design elements drawn from other work by the author in this area, three mammogram recognition systems reflecting successive technological developments are implemented, culminating in a system based on Convolutional Neural Networks (CNN), whose architecture is designed on the basis of the preceding theoretical analysis and of the practical scale-analysis results obtained on our image database. All three systems are trained and validated on the DDSM mammography database, with a total of 100 training samples and 100 test samples chosen to avoid bias and to faithfully represent a screening program. The validity of the CNN approach for this problem is demonstrated, and a research path for designing the architecture of these networks is proposed.
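As a concrete illustration of the CNN approach described above, the following is a minimal sketch of a patch-level convolutional classifier for mammogram patches; the two-stage architecture, the 64x64 patch size and the two-class output are illustrative assumptions, not the architecture designed in the Thesis.

```python
# Minimal sketch of a patch-level CNN classifier for microcalcification
# detection; layer sizes, patch size and class count are illustrative
# assumptions, not the Thesis' actual architecture.
import torch
import torch.nn as nn

class MicrocalcCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Two convolution/pooling stages capture features at increasing scale,
        # echoing the abstract's emphasis on scale and hierarchy.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                  # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                  # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),       # microcalcification vs. background
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: classify a batch of 64x64 grayscale mammogram patches.
patches = torch.randn(8, 1, 64, 64)
logits = MicrocalcCNN()(patches)
print(logits.shape)  # torch.Size([8, 2])
```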
Abstract:
Loftus (Memory & Cognition 6:312-319, 1978) distinguished between interpretable and uninterpretable interactions. Uninterpretable interactions are ambiguous, because they may be due to two additive main effects (no interaction) combined with a nonlinear relationship between the (latent) outcome variable and its indicator. Interpretable interactions can only be due to the presence of a true interactive effect in the outcome variable, regardless of the relationship that it establishes with its indicator. In the present article, we first show that the same problem can arise when an unmeasured mediator has a nonlinear effect on the measured outcome variable. Then we integrate Loftus's arguments with a seemingly contradictory approach to interactions suggested by Rosnow and Rosenthal (Psychological Bulletin 105:143-146, 1989). We show that entire data patterns, not just interaction effects alone, produce interpretable or uninterpretable interactions. Next, we show that the same problem of interpretability can apply to main effects. Lastly, we give concrete advice on what researchers can do to generate data patterns that provide unambiguous evidence for hypothesized interactions.
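The core point about uninterpretable interactions can be illustrated with a toy simulation: with purely additive effects on a latent outcome, a monotone nonlinear indicator can still produce an apparent interaction in the measured data. The 2x2 cell means below are invented for illustration only.

```python
# Toy illustration of Loftus's point: a monotone (nonlinear) mapping from a
# latent outcome to its measured indicator can create a spurious interaction
# even though the latent effects are purely additive. Cell values are invented.
import numpy as np

# Latent 2x2 cell means with two additive main effects and NO interaction.
latent = np.array([[1.0, 2.0],
                   [3.0, 4.0]])

def interaction(cells):
    """Difference-of-differences in a 2x2 design (0 => no interaction)."""
    return (cells[1, 1] - cells[1, 0]) - (cells[0, 1] - cells[0, 0])

observed = np.exp(latent)  # monotone, nonlinear indicator of the latent variable

print(interaction(latent))              # 0.0   -> additive on the latent scale
print(round(interaction(observed), 2))  # ~29.84 -> apparent interaction in the data
```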
Abstract:
The neglect of a consideration of history has been a feature of mobility research. ‘History’ affects the results of analyses of social mobility by altering the occupational/industrial structure and by encouraging exchange mobility. Changes in industrial structure are rooted more directly in historical causes and can be seen as more fundamental than changes in occupational structure. Following a substantial review of the secondary literature on changes in industrial and occupational structure in Northern Ireland, loglinear analyses of intra- and intergenerational mobility tables for sociologically derived cohort generations, incorporating both occupational and industrial categories, are presented. Structural and inheritance effects for industry are as significant as those for occupation. Given the well-established finding of ‘constant social fluidity’ in mobility tables once structural effects are controlled, the inclusion of categorization by industry is necessary in order to reach an accurate understanding of occupational mobility and the role of historical change in mobility.
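As a hedged sketch of the kind of loglinear analysis referred to, the example below fits a Poisson GLM with origin and destination main effects (the independence baseline against which structural and inheritance effects are judged) to an invented 3x3 mobility table; the article's actual models, which also incorporate industrial categories and cohort generations, are not reproduced here.

```python
# Minimal sketch of a loglinear analysis of a mobility table: a Poisson GLM
# with origin and destination main effects only. The 3x3 table is invented.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

counts = [[120, 40, 10],
          [ 50, 90, 30],
          [ 15, 45, 80]]
categories = ["manual", "clerical", "professional"]
rows = [{"origin": o, "destination": d, "count": counts[i][j]}
        for i, o in enumerate(categories)
        for j, d in enumerate(categories)]
df = pd.DataFrame(rows)

model = smf.glm("count ~ C(origin) + C(destination)",
                data=df, family=sm.families.Poisson()).fit()
print(model.deviance)  # large deviance => counts depart from simple independence
```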
Abstract:
This article examines ways in which tutors can help adult literacy learners explore and understand the ways in which inequalities in society have impacted on their lives.
Abstract:
A liquid chromatography-thermospray mass spectrometric assay was developed and validated to confirm the presence of illegal residues of the synthetic androgenic growth promoter, trenbolone acetate, in cattle. The assay was specific for 17alpha-trenbolone, the major bovine metabolite of trenbolone acetate. Methods were developed for the determination of 17alpha-trenbolone in both bile and faeces, the most appropriate matrices for the control of trenbolone acetate abuse. The clean-up procedure developed relied on enzymatic hydrolysis, followed by sequential liquid-liquid and liquid-solid extraction. The extracts were then subjected to immunoaffinity chromatography. 17alpha-Trenbolone was detected by selected ion monitoring at m/z 271 using positive ion thermospray ionisation. The limit of detection was approximately 0.5 ng/g in faeces and 0.5 ng/ml in bile.
Abstract:
Accurate three-dimensional (3D) models of lumbar vertebrae are required for image-based 3D kinematic analysis. MRI or CT datasets are frequently used to derive 3D models but have the disadvantages of being expensive, time-consuming or involving ionizing radiation (e.g., CT acquisition). In this chapter, we present an alternative technique that can reconstruct a scaled 3D lumbar vertebral model from a single two-dimensional (2D) lateral fluoroscopic image and a statistical shape model. Cadaveric studies are conducted to verify the reconstruction accuracy by comparing the surface models reconstructed from a single lateral fluoroscopic image to ground-truth data from 3D CT segmentation. A mean reconstruction error between 0.7 and 1.4 mm was found.
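To illustrate the statistical-shape-model idea in isolation, the sketch below recovers the mode weights of a PCA shape model from 2D landmarks of a single lateral (orthographic) projection by linear least squares; the mean shape, modes and landmarks are synthetic placeholders, and the chapter's actual reconstruction pipeline is not reproduced.

```python
# Sketch of the statistical-shape-model idea: recover PCA mode weights so that
# the projected 3D model matches 2D landmarks from a single lateral view.
# Mean shape, modes and "fluoroscopic" landmarks are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_points, n_modes = 40, 5

mean_shape = rng.normal(size=(n_points, 3))       # mean vertebra surface points
modes = rng.normal(size=(n_modes, n_points, 3))   # PCA modes of variation

def project_lateral(pts3d):
    """Orthographic lateral projection: keep (x, z), drop the depth axis y."""
    return pts3d[:, [0, 2]]

# Simulate a "ground-truth" shape and its 2D lateral landmarks.
true_w = rng.normal(size=n_modes)
landmarks_2d = project_lateral(mean_shape + np.tensordot(true_w, modes, axes=1))

# Fitting: the projection is linear in the mode weights, so least squares suffices.
A = np.stack([project_lateral(m).ravel() for m in modes], axis=1)  # (2*n_points, n_modes)
b = (landmarks_2d - project_lateral(mean_shape)).ravel()
w_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(w_hat, true_w, atol=1e-6))  # True: weights recovered in this noise-free toy case
```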
Abstract:
Automated identification of vertebrae from X-ray image(s) is an important step for various medical image computing tasks such as 2D/3D rigid and non-rigid registration. In this chapter we present a graphical-model-based solution for automated vertebra identification from X-ray image(s). Our solution does not require a training process on training data and is able to determine automatically the number of vertebrae visible in the image(s). This is achieved by combining a graphical-model-based maximum a posteriori (MAP) estimate with mean-shift-based clustering. Experiments conducted on simulated X-ray images, as well as on a low-dose, low-quality spinal X-ray image of a scoliotic patient, verified its performance.
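The clustering step mentioned above can be sketched in isolation: below, mean-shift groups synthetic candidate detections so that the number of clusters yields the number of visible vertebrae. The candidate coordinates and bandwidth are assumptions, and the graphical-model MAP stage of the chapter is not reproduced.

```python
# Sketch of only the clustering step: mean-shift groups candidate vertebra
# detections so that the number of clusters gives the number of visible
# vertebrae. Candidate positions are synthetic.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(1)

# Noisy candidate centre detections around 5 true vertebral centres (pixels).
true_centres = np.array([[100, 80], [100, 150], [102, 220], [98, 290], [101, 360]])
candidates = np.vstack([c + rng.normal(scale=5, size=(20, 2)) for c in true_centres])

ms = MeanShift(bandwidth=30).fit(candidates)
n_vertebrae = len(ms.cluster_centers_)

print(n_vertebrae)                     # 5 vertebrae detected
print(np.round(ms.cluster_centers_))   # estimated vertebral centres
```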
Abstract:
A recent Australian survey of beginning teachers indicates that the issue of classroom management continues to be a key concern for early-career educators (Australian Education Union, 2007). This finding is supported by the wider literature, which identifies managing the classroom, particularly managing behaviour within the classroom, as a critical issue for early-career teachers (Arends, 2006; Charles, 2004; Groundwater-Smith, Ewing & Le Cornu, 2007). In fact, struggling to manage student behaviour and to maintain positive relationships with students are among the top reasons for teachers leaving the profession (Charles, 2004). So, how does a teacher effectively organise and manage the learning and behaviour of up to thirty students at any one time? The issue of classroom management is a persistent one for all teachers, but it is particularly daunting for new teachers. Historically, classrooms were built on strong hierarchical structures that relied heavily on teacher control and authority. More recent approaches to managing the classroom, however, are proactive and more collaborative. That is not to say that there exists a single management recipe; far from it. Beginning teachers must view possible approaches to managing the classroom in light of their own beliefs about teaching and learning, their current classroom practice and variables from the context in which they are teaching.