Abstract:
At the first full conference of the European Academy of Occupational Health Psychology (Lund, 1999), the decision was ratified to organise activities around three fora. These together represented the pillars on which the European Academy had been founded that same year: education, research and professional practice. Each forum was convened by a chairperson and a small group of full members; it was agreed that a forum meeting would take place at each full conference and that working groups would be established to move developments forward between conferences. The forum system has proven an effective means by which to channel the energies of individual members, and the institutions that they represent, towards advancements in all three areas of activity in occupational health psychology (OHP) in Europe. During the meeting of the education forum at the third full European Academy conference (Barcelona, 2001), the proposal was made for the establishment of a working party that would be tasked with the production of a strategy document on The Promotion of Education in Occupational Health Psychology in Europe. The proposal was ratified at the subsequent annual business meeting held during the same conference. The draft outline of the strategy document was published for consultation in the European Academy’s e-newsletter (Vol. 3.1, 2002) and the final document was presented to the meeting of the education forum at the fourth full conference (Vienna, 2002). The strategy document constituted a seminal piece of literature in so far as it provided a foundation and structure capable of guiding pan-European developments in education in OHP – developments that would ensure the sustained growth of the discipline and assure it of a long-standing, embedded place in both the scholarly and professional domains. To these ends, the strategy document presented six objectives as important for the sustained expansion and promotion of education in the discipline in Europe, namely the development of: [1] a core syllabus for education in occupational health psychology; [2] a mechanism for identifying, recognising and listing undergraduate and postgraduate modules and courses (programmes) in occupational health psychology; [3] structures to support the extension of the current provision of education in occupational health psychology; [4] ways of enhancing convergence of the current provision of education in occupational health psychology; [5] ways of encouraging regional cooperation between education providers across the regions of Europe; and [6] ways of ensuring consistency with North American developments in education and promoting worldwide co-operation in education. Five years have elapsed since the presentation of these laudable objectives to the meeting of the education forum in Vienna in December 2002. In that time OHP has undergone considerable growth, particularly in Europe and North America. Expansion has been reflected in the evolution of existing, and the emergence of new, representative bodies for the discipline on both sides of the Atlantic Ocean. As such, it might be considered timely to pause and reflect on what has been achieved in respect of each of the objectives set out in the strategy document. The current chapter examines progress on the six objectives and considers what remains to be done. This exercise is undertaken not merely to celebrate achievements in some areas and lament slow progress in others. Rather, on the one hand it serves to highlight areas where real progress has been made, with a view to presenting these areas as ripe for further capitalisation. On the other hand, it serves to direct the attention of stakeholders (all those with a vested interest in OHP) to those key parts of the jigsaw puzzle of a self-sustaining pan-European education framework that remain to be satisfactorily addressed.
Abstract:
The Java programming language has potentially significant advantages for wireless sensor nodes, but there is currently no feature-rich, open source virtual machine available. In this paper we present Darjeeling, a system comprising offline tools and a memory-efficient run-time. The offline post-compiler tool analyzes, links and consolidates Java class files into loadable modules. The runtime implements a modified Java VM that supports multithreading and is designed specifically to operate in constrained execution environments such as wireless sensor network nodes; it supports inheritance, threads, garbage collection, and loadable modules. We have demonstrated Java running on AVR128 and MSP430 microcontrollers at speeds of up to 70,000 JVM instructions per second.
Abstract:
The Java programming language enjoys widespread popularity on platforms ranging from servers to mobile phones. While efforts have been made to run Java on microcontroller platforms, there is currently no feature-rich, open source virtual machine available. In this paper we present Darjeeling, a system comprising offline tools and a memory efficient runtime. The offline post-compiler tool analyzes, links and consolidates Java class files into loadable modules. The runtime implements a modified Java VM that supports multithreading and is designed specifically to operate in constrained execution environments such as wireless sensor network nodes. Darjeeling improves upon existing work by supporting inheritance, threads, garbage collection, and loadable modules while keeping memory usage to a minimum. We have demonstrated Java running on AVR128 and MSP430 micro-controllers at speeds of up to 70,000 JVM instructions per second.
Abstract:
This paper proposes a semi-supervised intelligent visual surveillance system to exploit the information from multi-camera networks for the monitoring of people and vehicles. Modules are proposed to perform critical surveillance tasks including: the management and calibration of cameras within a multi-camera network; tracking of objects across multiple views; recognition of people utilising biometrics and in particular soft-biometrics; the monitoring of crowds; and activity recognition. Recent advances in these computer vision modules and capability gaps in surveillance technology are also highlighted.
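To make the modular architecture concrete, the following Python sketch shows one way such surveillance modules could be composed into a per-frame processing pipeline. The `Frame` type, module names and interfaces here are illustrative assumptions, not the system described above.

```python
from dataclasses import dataclass
from typing import Any, Dict, List


@dataclass
class Frame:
    """A single camera frame plus metadata (camera id, timestamp, pixels)."""
    camera_id: str
    timestamp: float
    image: Any = None


class SurveillanceModule:
    """Common interface assumed for every processing module in this sketch."""
    def process(self, frame: Frame, state: Dict[str, Any]) -> Dict[str, Any]:
        raise NotImplementedError


class CalibrationModule(SurveillanceModule):
    def process(self, frame, state):
        # Hypothetical step: map image coordinates into a common ground plane.
        state.setdefault("calibrated", []).append(frame.camera_id)
        return state


class CrossCameraTracker(SurveillanceModule):
    def process(self, frame, state):
        # Hypothetical step: associate detections across overlapping views.
        state.setdefault("tracks", []).append((frame.camera_id, frame.timestamp))
        return state


def run_pipeline(frames: List[Frame], modules: List[SurveillanceModule]) -> Dict[str, Any]:
    """Push every frame through the ordered list of modules."""
    state: Dict[str, Any] = {}
    for frame in frames:
        for module in modules:
            state = module.process(frame, state)
    return state


if __name__ == "__main__":
    frames = [Frame("cam-01", 0.00), Frame("cam-02", 0.04)]
    print(run_pipeline(frames, [CalibrationModule(), CrossCameraTracker()]))
```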
Abstract:
BACKGROUND The work described in this paper has emerged from an ALTC/OLT-funded project, Exploring Intercultural Competency in Engineering. The project identified many facets of culture and intercultural competence that go beyond a culture-as-nationality paradigm. It was clear from this work that resources were needed to help engineering educators introduce students to the complex issues of culture as they relate to engineering practice. A set of learning modules focussing on intercultural competence in engineering practice was developed early in the project. Through the OLT project, these modules have been expanded into a range of resources covering various aspects of culture in engineering. Supporting the resources, an eBook detailing the ins and outs of intercultural competency has also been developed to assist engineering educators to embed opportunities for students to develop skills in unpacking and managing cross-cultural challenges in engineering practice. PURPOSE This paper describes the key principles behind the development of the learning modules, the areas they cover and the eBook developed to support the modules. The paper is intended as an introduction to the approaches and resources and extends an invitation to the community to draw from, and contribute to, this initial work. DESIGN/METHOD A key aim of this project was to go beyond the culture-as-nationality approach adopted in much of the work around intercultural competency (Deardorff, 2011). The eBook explores different dimensions of culture such as workplace culture, culture’s influence on engineering design, and culture in the classroom. The authors describe how these connect to industry practice and explore what they mean for engineering education. The packaged learning modules described here have been developed as a matrix of approaches moving from familiar, known methods through complicated activities that rely to some extent on expert knowledge. Some modules draw on the concept of ‘complex un-order’ as described in the ‘Cynefin domains’ proposed by Kurtz and Snowden (2003). RESULTS Several of the modules included in the eBook have already been trialled at a variety of institutions. Feedback from staff has been reassuringly positive so far. Further trials are planned for second semester 2012, and version 1 of the eBook and learning modules, Engineering Across Cultures, is due to be released in late October 2012. CONCLUSIONS The Engineering Across Cultures eBook and learning modules provide a useful and ready-to-employ resource to help educators tackle the complex issue of intercultural competency in engineering education. The book is by no means exhaustive, nor are the modules; instead they provide an accessible, engineering-specific guide to bringing cultural issues into the engineering classroom.
Abstract:
Enterprise Resource Planning (ERP) systems are integrated enterprise-wide standard information systems that automate all aspects of an organisation’s business processes. The ERP philosophy is that business systems incorporating sales, marketing, manufacturing, distribution, personnel and finance modules can be supported by a single integrated system with all of the company’s data captured in a central database. The ERP packages of vendors such as SAP, Baan, J.D. Edwards and Intentia represent more than a common systems platform for a business: they prescribe information blueprints of how an organisation’s business processes should operate. In this paper, the scale and strategic importance of ERP systems are identified and the problem of ERP implementation is defined. Five company examples are analysed using a Critical Success Factors (CSFs) theoretical framework. The paper offers a framework for managers that provides the basis for developing an ERP implementation strategy. The case analysis identifies different approaches to ERP implementation, highlights the critical role of legacy systems in influencing the implementation process, and identifies the importance of business process change and software configuration in addition to factors already cited in the literature such as top management support and communication. The implications of the results and future research opportunities are outlined.
Abstract:
The era of large sequencing projects has enabled unprecedented possibilities for investigating more complex aspects of living organisms. Among the high-throughput technologies based on genomic sequences, DNA microarrays are widely used for many purposes, including the measurement of the relative quantity of messenger RNAs. However, the reliability of microarrays has been strongly doubted, as robust analysis of the complex microarray output data was developed only after the technology had already spread in the community. One objective of this study was to increase the performance of microarrays, measured by the successful validation of the results with independent techniques. To this end, emphasis has been given to the possibility of selecting candidate genes with remarkable biological significance within specific experimental designs. Along with literature evidence, re-annotation of the probes and model-based normalization algorithms were found to be beneficial when analyzing Affymetrix GeneChip data. Typically, the analysis of microarrays aims at selecting genes whose expression is significantly different between conditions and then grouping them into functional categories, enabling a biological interpretation of the results. Another approach investigates the global differences in the expression of functionally related groups of genes. Here, this technique has been effective in discovering patterns related to temporal changes during infection of human cells. Another aspect explored in this thesis is the possibility of combining independent gene expression data to create a catalog of genes that are selectively expressed in healthy human tissues. Not all the genes present in human cells are active; some, involved in basic activities (named housekeeping genes), are expressed ubiquitously. Other genes (named tissue-selective genes) provide more specific functions and are expressed preferably in certain cell types or tissues. Defining the tissue-selective genes is also important, as these genes can cause disease with a phenotype in the tissues where they are expressed. The hypothesis that gene expression could be used as a measure of the relatedness of tissues has also been proven. Microarray experiments provide long lists of candidate genes that are often difficult to interpret and prioritize. Extending the power of microarray results is possible by inferring the relationships of genes under certain conditions. Gene transcription is constantly regulated by the coordinated binding of proteins, named transcription factors, to specific portions of a gene's promoter sequence. In this study, the analysis of promoters from groups of candidate genes has been used to predict gene networks and to highlight modules of transcription factors playing a central role in the regulation of their transcription. Specific modules have been found to regulate the expression of genes selectively expressed in the hippocampus, an area of the brain having a central role in Major Depressive Disorder. Similarly, gene networks derived from microarray results have elucidated aspects of the development of the mesencephalon, another region of the brain, involved in Parkinson's disease.
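As an illustration of the gene-selection step described above (finding genes whose expression differs significantly between conditions), the following Python sketch applies a per-gene Welch t-test with a Benjamini-Hochberg correction to synthetic data. The data, thresholds and choice of test are assumptions; the normalisation, re-annotation and promoter-analysis steps of the thesis are not reproduced.

```python
import numpy as np
from scipy import stats


def select_differential_genes(expr_a, expr_b, gene_ids, alpha=0.05):
    """Per-gene Welch t-test between two conditions, followed by a
    Benjamini-Hochberg false-discovery-rate correction.

    expr_a, expr_b : arrays of shape (n_genes, n_replicates)
    """
    t, p = stats.ttest_ind(expr_a, expr_b, axis=1, equal_var=False)

    # Benjamini-Hochberg step-up procedure on the sorted p-values.
    order = np.argsort(p)
    ranks = np.arange(1, len(p) + 1)
    critical = ranks / len(p) * alpha
    passed = p[order] <= critical
    n_sig = passed.nonzero()[0].max() + 1 if passed.any() else 0
    selected = order[:n_sig]

    return [(gene_ids[i], float(p[i])) for i in selected]


if __name__ == "__main__":
    # Synthetic stand-in for normalised log-expression data (assumed values).
    rng = np.random.default_rng(0)
    genes = [f"gene_{i}" for i in range(500)]
    healthy = rng.normal(8.0, 1.0, size=(500, 5))
    infected = rng.normal(8.0, 1.0, size=(500, 5))
    infected[:20] += 2.0          # simulate 20 up-regulated genes

    for gene, pval in select_differential_genes(healthy, infected, genes)[:10]:
        print(gene, f"p={pval:.2e}")
```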
Abstract:
An automated geo-hazard warning system is the need of the hour; it integrates automated hazard evaluation with automated warning communication. The primary objective of this paper is to describe a geo-hazard warning system, based on an Internet-resident concept and the available cellular mobile infrastructure, that makes use of geo-spatial data. The system has a modular architecture comprising input, understanding, expert, output and warning modules. The system thus provides flexibility in integrating different types of hazard evaluation and communication systems, leading to a generalized hazard warning system. The developed system has been validated for landslide hazard in Indian conditions. It has been realized using landslide causative factors, rainfall forecasts from NASA's TRMM (Tropical Rainfall Measuring Mission) and a knowledge base of landslide hazard intensity maps, and it invokes warnings as warranted. The system's hazard evaluation agreed with expert evaluation to within 5-6 % variability, and warning message delivery was found to be virtually instantaneous, with a maximum recorded time lag of 50 s and a minimum of 10 s. It can therefore be concluded that a novel, stand-alone system for dynamic hazard warning has been developed and implemented. Such a system could be very useful in a densely populated country where people are unaware of impending hazards.
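A minimal Python sketch of the modular flow outlined above (input, expert evaluation, warning dissemination) is given below. The hazard-scoring weights, rainfall threshold and print-based warning stub are placeholders, not the calibrated rules or cellular gateway of the actual system.

```python
from dataclasses import dataclass


@dataclass
class CellObservation:
    """One grid cell of the study area (input module output)."""
    cell_id: str
    susceptibility: float     # from a landslide hazard intensity map, 0..1
    forecast_rain_mm: float   # e.g. from a TRMM-style rainfall forecast


def expert_module(obs: CellObservation) -> str:
    """Combine static susceptibility with forecast rainfall into a hazard class.

    The weights and thresholds below are placeholders, not the published
    system's calibrated rules.
    """
    score = 0.6 * obs.susceptibility + 0.4 * min(obs.forecast_rain_mm / 150.0, 1.0)
    if score > 0.75:
        return "HIGH"
    if score > 0.5:
        return "MODERATE"
    return "LOW"


def warning_module(obs: CellObservation, hazard: str) -> None:
    """Stand-in for the cellular (SMS) warning dissemination module."""
    if hazard != "LOW":
        print(f"[WARNING] cell {obs.cell_id}: {hazard} landslide hazard expected")


if __name__ == "__main__":
    observations = [
        CellObservation("C-101", susceptibility=0.82, forecast_rain_mm=120.0),
        CellObservation("C-102", susceptibility=0.35, forecast_rain_mm=40.0),
    ]
    for obs in observations:               # input module stubbed as a list
        warning_module(obs, expert_module(obs))
```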
Abstract:
High wind poses a number of hazards in areas such as structural safety, aviation, wind energy (where low wind speed is also a concern) and pollutant transport, to name a few. Therefore, a good prediction tool for wind speed is necessary in these areas. Like many other natural processes, the behavior of wind is associated with considerable uncertainties stemming from different sources. Therefore, to develop a reliable prediction tool for wind speed, these uncertainties should be taken into account. In this work, we propose a probabilistic framework for the prediction of wind speed from measured spatio-temporal data. The framework is based on decompositions of the spatio-temporal covariance and simulation using these decompositions. A novel simulation method based on a tensor decomposition is used in this context. The proposed framework is composed of a set of four modules, and the modules have the flexibility to accommodate further modifications. The framework is applied to measured data on wind speed in Ireland. Both short- and long-term predictions are addressed.
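To illustrate the idea of simulating wind speed from a decomposition of its covariance, the following Python sketch fits a spatial covariance matrix to a synthetic historical record and draws new samples from its eigen-decomposition (a Karhunen-Loève-style construction). The tensor decomposition of the paper is not reproduced; the station layout, Gaussian assumption and threshold are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for measured data: wind speed (m/s) at 6 stations over
# 1000 time steps, with spatial correlation baked in for the demonstration.
n_stations, n_times = 6, 1000
true_chol = np.linalg.cholesky(0.5 * np.eye(n_stations) + 0.5)
history = 8.0 + 2.0 * (true_chol @ rng.standard_normal((n_stations, n_times)))

# 1. Estimate the mean and spatial covariance from the historical record.
mean = history.mean(axis=1)
cov = np.cov(history)

# 2. Decompose the covariance (eigen-decomposition of a symmetric matrix).
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals = np.clip(eigvals, 0.0, None)        # guard against negative round-off


# 3. Simulate new spatial wind-speed fields from the decomposition.
def simulate(n_samples: int) -> np.ndarray:
    z = rng.standard_normal((n_stations, n_samples))
    return mean[:, None] + eigvecs @ (np.sqrt(eigvals)[:, None] * z)


samples = simulate(5000)
print("simulated mean per station:", np.round(samples.mean(axis=1), 2))
print("P(speed > 12 m/s) at station 0:", np.mean(samples[0] > 12.0))
```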
Abstract:
The scalability of CMOS technology has driven computation into a diverse range of applications across the power consumption, performance and size spectra. Communication is a necessary adjunct to computation, and whether this is to push data from node to node in a high-performance computing cluster or from the receiver of a wireless link to a neural stimulator in a biomedical implant, interconnect can take up a significant portion of the overall system power budget. Although a single interconnect methodology cannot address such a broad range of systems efficiently, there are a number of key design concepts that enable good interconnect design in the age of highly-scaled CMOS: an emphasis on highly-digital approaches to solving ‘analog’ problems, hardware sharing between links as well as between different functions (such as equalization and synchronization) in the same link, and adaptive hardware that changes its operating parameters to mitigate not only variation in the fabrication of the link, but also link conditions that change over time. These concepts are demonstrated through the use of two design examples, at the extremes of the power and performance spectra.
A novel all-digital clock and data recovery technique for high-performance, high density interconnect has been developed. Two independently adjustable clock phases are generated from a delay line calibrated to 2 UI. One clock phase is placed in the middle of the eye to recover the data, while the other is swept across the delay line. The samples produced by the two clocks are compared to generate eye information, which is used to determine the best phase for data recovery. The functions of the two clocks are swapped after the data phase is updated; this ping-pong action allows an infinite delay range without the use of a PLL or DLL. The scheme's generalized sampling and retiming architecture is used in a sharing technique that saves power and area in high-density interconnect. The eye information generated is also useful for tuning an adaptive equalizer, circumventing the need for dedicated adaptation hardware.
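The ping-pong phase-selection loop can be illustrated with a small behavioural model. The Python sketch below sweeps one sampling phase across a 2-UI window, counts disagreements with the current data phase to build an eye profile, moves the data phase to the centre of the widest error-free region, and repeats; the eye shape, noise model and delay-line resolution are assumptions rather than measurements of the described circuit.

```python
import random

UI_STEPS = 64                      # assumed delay-line resolution per UI
WINDOW = 2 * UI_STEPS              # delay line calibrated to 2 UI


def sample(phase: int, bit: int) -> int:
    """Toy channel: decisions are reliable only away from data transitions."""
    distance = min(phase % UI_STEPS, UI_STEPS - phase % UI_STEPS)
    error_prob = 0.5 if distance < 8 else 0.0      # closed eye near transitions
    return bit ^ (random.random() < error_prob)


def scan_eye(data_phase: int, bits_per_step: int = 200) -> list:
    """Sweep the second clock across the window, counting disagreements
    with the data clock at each candidate phase (the eye information)."""
    errors = []
    for sweep_phase in range(WINDOW):
        disagreements = 0
        for _ in range(bits_per_step):
            bit = random.getrandbits(1)
            if sample(sweep_phase, bit) != sample(data_phase, bit):
                disagreements += 1
        errors.append(disagreements)
    return errors


def best_phase(errors: list) -> int:
    """Centre of the longest run of error-free phases."""
    best_start, best_len, start = 0, 0, None
    for i, e in enumerate(errors + [1]):           # sentinel closes a final run
        if e == 0 and start is None:
            start = i
        elif e != 0 and start is not None:
            if i - start > best_len:
                best_start, best_len = start, i - start
            start = None
    return best_start + best_len // 2


if __name__ == "__main__":
    data_phase = UI_STEPS // 2                     # start in the middle of the eye
    for iteration in range(3):                     # repeated ping-pong updates
        eye = scan_eye(data_phase)
        data_phase = best_phase(eye)               # roles of the two clocks swap here
        print(f"iteration {iteration}: new data phase = {data_phase}")
```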
On the other side of the performance/power spectra, a capacitive proximity interconnect has been developed to support 3D integration of biomedical implants. In order to integrate more functionality while staying within size limits, implant electronics can be embedded onto a foldable parylene (‘origami’) substrate. Many of the ICs in an origami implant will be placed face-to-face with each other, so wireless proximity interconnect can be used to increase communication density while decreasing implant size, as well as facilitate a modular approach to implant design, where pre-fabricated parylene-and-IC modules are assembled together on-demand to make custom implants. Such an interconnect needs to be able to sense and adapt to changes in alignment. The proposed array uses a TDC-like structure to realize both communication and alignment sensing within the same set of plates, increasing communication density and eliminating the need to infer link quality from a separate alignment block. In order to distinguish the communication plates from the nearby ground plane, a stimulus is applied to the transmitter plate, which is rectified at the receiver to bias a delay generation block. This delay is in turn converted into a digital word using a TDC, providing alignment information.
Abstract:
Thermal resistance and thermal rise-time are two basic parameters that greatly affect most aspects of a laser diode's performance. By measuring the waveforms received after a spectroscope at wavelengths varied step by step, the spectrally resolved waveforms can be converted to calculate the thermal rise-time. Basic formulas for the spectrum variation of a laser diode and the measurement set-up using a Boxcar are described in the paper. As an example, the thermal rise-time of a p-side-up packaged short-pulse laser diode was measured by this method to be 390 µs. The method will be useful in characterizing diode lasers and LID modules in high-power applications.
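Once the spectrally resolved waveforms have been reduced to a wavelength-versus-time trace, the thermal rise-time can be extracted as the time constant of an exponential fit. The Python sketch below performs only that final fitting step on synthetic data with an assumed 390 µs constant; the boxcar measurement and spectral conversion themselves are not modelled.

```python
import numpy as np
from scipy.optimize import curve_fit


def exponential_rise(t, delta_lambda, tau, lambda0):
    """Assumed wavelength drift during the pulse: lambda0 + dL*(1 - exp(-t/tau))."""
    return lambda0 + delta_lambda * (1.0 - np.exp(-t / tau))


# Synthetic wavelength-vs-time trace with a 390 us thermal time constant.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 2e-3, 400)                      # seconds
measured = exponential_rise(t, 0.8, 390e-6, 850.0)   # nm, assumed values
measured += rng.normal(0.0, 0.01, size=t.size)       # measurement noise

popt, _ = curve_fit(exponential_rise, t, measured, p0=(1.0, 1e-4, 849.0))
print(f"fitted thermal rise-time: {popt[1] * 1e6:.0f} us")
```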
Abstract:
This thesis is the result of three years of fieldwork during which holotropic groups were followed through seven theoretical and experiential modules of the training in Holotropic Breathwork and Transpersonal Psychology offered by the Grof Transpersonal Training. The central objective of the research is to present an account of how therapeutic practice was constituted in these groups. Through the association of various elements regarded as agents of the field (actants), complex and singular scenarios took shape within the holotropic set and setting for the fabrication of what has been called "opportunities to heal", that is, situations in which conditions were created for the emergence of experiences oriented towards "self-knowledge" and "personal transformation" among the members of the groups. The research took shape through the use of certain theoretical and methodological tools of Actor-Network Theory, in particular those proposed by Bruno Latour and related authors, which, during the writing of this laboratory-text, came into play as allies in the work of staging the process of fabricating "opportunities to heal" in the holotropic groups.
Abstract:
The unique optoelectronic properties of graphene make it an ideal platform for a variety of photonic applications, including fast photodetectors, transparent electrodes in displays and photovoltaic modules, optical modulators, plasmonic devices, microcavities, and ultra-fast lasers. Owing to its high carrier mobility, gapless spectrum and frequency-independent absorption, graphene is a very promising material for the development of detectors and modulators operating in the terahertz region of the electromagnetic spectrum (wavelengths in the hundreds of micrometres), still severely lacking in terms of solid-state devices. Here we demonstrate terahertz detectors based on antenna-coupled graphene field-effect transistors. These exploit the nonlinear response to the oscillating radiation field at the gate electrode, with contributions of thermoelectric and photoconductive origin. We demonstrate room temperature operation at 0.3 THz, showing that our devices can already be used in realistic settings, enabling large-area, fast imaging of macroscopic samples.
Abstract:
A novel integration method for the production of cost-effective optoelectronic printed circuit boards (OE PCBs) is presented. The proposed integration method allows fabrication of OE PCBs with manufacturing processes common to the electronics industry while enabling direct attachment of electronic components onto the board with solder reflow processes as well as board assembly with automated pick-and-place tools. The OE PCB design is based on the use of polymer multimode waveguides, end-fired optical coupling schemes, and simple electro-optic connectors, eliminating the need for additional optical components in the optical layer, such as micro-mirrors and micro-lenses. A proof-of-concept low-cost optical transceiver produced with the proposed integration method is presented. This transceiver is fabricated on a low-cost FR4 substrate, comprises a polymer Y-splitter together with the electronic circuitry of the transmitter and receiver modules, and achieves error-free 10-Gb/s bidirectional data transmission. Theoretical studies on the optical coupling efficiencies and alignment tolerances achieved with the employed end-fired coupling schemes are presented, while experimental results on the optical transmission characteristics, frequency response, and data transmission performance of the integrated optical links are reported. The demonstrated optoelectronic unit can be used as a front-end optical network unit in short-reach data communication links.
Abstract:
Coupled Monte Carlo depletion systems provide a versatile and accurate tool for analyzing advanced thermal and fast reactor designs for a variety of fuel compositions and geometries. The main drawback of Monte Carlo-based systems is the long calculation time, which imposes significant restrictions on the complexity and amount of design-oriented calculations. This paper presents an alternative approach to interfacing the Monte Carlo and depletion modules aimed at addressing this problem. The main idea is to calculate the one-group cross sections for all relevant isotopes required by the depletion module in a separate module, external to the Monte Carlo calculations. Thus, the Monte Carlo module will produce the criticality and neutron spectrum only, without tallying the individual isotope reaction rates. The one-group cross sections for all isotopes will be generated in a separate module by collapsing a universal multigroup (MG) cross-section library using the Monte Carlo-calculated flux. Here, the term "universal" means that a single MG cross-section set will be applicable to all reactor systems and is independent of reactor characteristics such as the neutron spectrum; fuel composition; and fuel cell, assembly, and core geometries. This approach was originally proposed by Haeck et al. and implemented in the ALEPH code. Implementation of the proposed approach to Monte Carlo burnup interfacing was carried out through the BGCORE system. One-group cross sections generated by the BGCORE system were compared with those tallied directly by the MCNP code. Analysis of this comparison led to the conclusion that, in order to achieve the accuracy required for reliable core and fuel cycle analysis, accounting for the background cross section (σ0) in the unresolved resonance energy region is essential. An extension of the one-group cross-section generation model was implemented and tested, tabulating and interpolating with a simplified σ0 model. A significant improvement in one-group cross-section accuracy was demonstrated.
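The interfacing scheme rests on a flux-weighted collapse of the multigroup library, sigma_1g = sum_g(sigma_g * phi_g) / sum_g(phi_g), evaluated with the Monte Carlo-calculated spectrum. The Python sketch below performs that collapse on made-up three-group data; the σ0 (background cross-section) tabulation and interpolation discussed above are not included.

```python
import numpy as np


def collapse_to_one_group(sigma_mg: np.ndarray, flux_mg: np.ndarray) -> float:
    """Flux-weighted one-group cross section:
        sigma_1g = sum_g(sigma_g * phi_g) / sum_g(phi_g)
    """
    return float(np.sum(sigma_mg * flux_mg) / np.sum(flux_mg))


if __name__ == "__main__":
    # Made-up 3-group capture cross sections (barns) for two isotopes and a
    # Monte Carlo-calculated group flux (arbitrary units); illustrative only.
    library = {
        "U-238": np.array([0.8, 12.0, 2.1]),
        "Pu-239": np.array([1.9, 35.0, 270.0]),
    }
    flux = np.array([0.55, 0.35, 0.10])      # fast, epithermal, thermal

    for isotope, sigma in library.items():
        print(isotope, f"{collapse_to_one_group(sigma, flux):.2f} b")
```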