946 results for file it easy
Abstract:
PURPOSE. To better understand the relative contributions of rods, cones, and melanopsin to the human pupillary light reflex (PLR) and to determine the optimal conditions for assessing the health of the rod, cone, and melanopsin pathways with a relatively brief clinical protocol. METHODS. The PLR was measured with an eye tracker, and stimuli were controlled with a Ganzfeld system. In experiment 1, 2.5 log cd/m² red (640 ± 10 nm) and blue (467 ± 17 nm) stimuli of various durations were presented after dark adaptation. In experiments 2 and 3, 1-second red and blue stimuli were presented at different intensity levels in the dark (experiment 2) or on a 0.78 log cd/m² blue background (experiment 3). Based on the results of experiments 1 to 3, a clinical protocol was designed and tested on healthy control subjects and patients with retinitis pigmentosa and Leber's congenital amaurosis. RESULTS. The optimal stimulus duration for producing the melanopsin-driven sustained pupil response after termination of an intense blue stimulus was 1 second. The rod- and melanopsin-driven components of the PLR are best studied with low- and high-intensity flashes, respectively, presented in the dark (experiment 2). A blue background suppressed rod and melanopsin responses, making it easy to assess the cone contribution with a red flash (experiment 3). With the clinical protocol, robust melanopsin responses could be seen in patients with few or no contributions from the rods and cones. CONCLUSIONS. It is possible to assess the rod, cone, and melanopsin contributions to the PLR with blue flashes at two or three intensity levels in the dark and one red flash on a blue background. (Invest Ophthalmol Vis Sci. 2011;52:6624-6635) DOI: 10.1167/iovs.11-7586
Abstract:
Software visualizations can provide a concise overview of a complex software system. Unfortunately, as software has no physical shape, there is no 'natural' mapping of software to a two-dimensional space. As a consequence, most visualizations tend to use a layout in which position and distance have no meaning, and the layout typically diverges from one visualization to another. We propose an approach to consistent layout for software visualization, called Software Cartography, in which the position of a software artifact reflects its vocabulary, and distance corresponds to similarity of vocabulary. We use Latent Semantic Indexing (LSI) to map software artifacts to a vector space, and then use Multidimensional Scaling (MDS) to map this vector space down to two dimensions. The resulting consistent layout allows us to develop a variety of thematic software maps that express very different aspects of software while making it easy to compare them. The approach is especially suitable for comparing views of evolving software, as the vocabulary of software artifacts tends to be stable over time. We present a prototype implementation of Software Cartography and illustrate its use with practical examples from numerous open-source case studies.
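The LSI-to-MDS pipeline described above can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the authors' implementation: it builds a TF-IDF term-document matrix from placeholder artifact "vocabularies", reduces it with truncated SVD (the core of LSI), and projects the artifacts to two dimensions with metric MDS.

```python
# Minimal sketch of a Software Cartography-style layout pipeline (not the
# authors' code): LSI via truncated SVD, then MDS down to two dimensions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.manifold import MDS

# Placeholder "vocabularies" of three software artifacts.
artifacts = [
    "parser token grammar syntax tree",
    "socket packet connection protocol stream",
    "lexer token scanner grammar input",
]

tfidf = TfidfVectorizer().fit_transform(artifacts)       # term-document matrix
lsi = TruncatedSVD(n_components=2).fit_transform(tfidf)  # LSI concept space
layout = MDS(n_components=2).fit_transform(lsi)          # 2D map positions

for name, (x, y) in zip(["parser", "network", "lexer"], layout):
    print(f"{name}: ({x:.2f}, {y:.2f})")  # similar vocabulary -> nearby points
```

Because the layout depends only on vocabulary, artifacts with overlapping identifiers (here the parser and lexer) land near each other, which is what makes maps of successive software versions comparable.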
Abstract:
Small clusters of gallium oxide, a technologically important high-temperature ceramic, together with the interaction of nucleic acid bases with graphene and small-diameter carbon nanotubes, are the focus of the first-principles calculations in this work. A high-performance parallel computing platform was also developed to perform these calculations at Michigan Tech. The first-principles calculations are based on density functional theory, employing either the local density or gradient-corrected approximation together with plane-wave and Gaussian basis sets. Bulk Ga2O3 is known to be a very good candidate for fabricating electronic devices that operate at high temperatures. To explore the properties of Ga2O3 at the nanoscale, we have performed a systematic theoretical study of small polyatomic gallium oxide clusters. The calculated results show that all lowest-energy isomers of GamOn clusters are dominated by Ga-O bonds rather than metal-metal or oxygen-oxygen bonds. Analysis of atomic charges suggests that the clusters are highly ionic, similar to bulk Ga2O3. In the study of the sequential oxidation of these clusters starting from Ga2O, it is found that the most stable isomers display up to four different backbones of constituent atoms. Furthermore, the predicted configuration of the ground state of Ga2O was recently confirmed by the experimental results of Neumark's group. Guided by the results of the gallium oxide cluster calculations, the performance-related challenge of computational simulation, that of producing high-performance computing platforms, was addressed. Several engineering aspects were thoroughly studied during the design, development, and implementation of the high-performance parallel computing platform, rama, at Michigan Tech. In an attempt to stay true to the principles of the Beowulf revolution, the rama cluster was extensively customized to make it easy to understand and use, for administrators as well as end users. Following the results of benchmark calculations, and to keep up with the complexity of the systems under study, rama was expanded to a total of sixty-four processors. Interest in the non-covalent interaction of DNA with carbon nanotubes has steadily increased over the past several years. This hybrid system, at the junction of the biological regime and the nanomaterials world, possesses features that make it very attractive for a wide range of applications. Using the in-house computational power available, we have studied the details of the interaction of nucleic acid bases with a graphene sheet as well as with a high-curvature, small-diameter carbon nanotube. The calculated trend in the binding energies strongly suggests that the polarizability of the base molecules determines the interaction strength of the nucleic acid bases with graphene. When comparing the results obtained here for physisorption on the small-diameter nanotube with those from the study on graphene, it is observed that the interaction strength of the nucleic acid bases is smaller for the tube. Thus, these results show that the effect of introducing curvature is to reduce the binding energy. The binding energies for the two extreme cases of negligible curvature (i.e., a flat graphene sheet) and of very high curvature (i.e., a small-diameter nanotube) may be considered as upper and lower bounds. This finding represents an important step towards a better understanding of the experimentally observed sequence-dependent interaction of DNA with carbon nanotubes.
Abstract:
This thesis develops high-performance real-time signal processing modules for direction-of-arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for subspace decomposition and polynomial rooting, two steps that are traditionally implemented with sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, the complexity of the signal processing algorithms grows, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps that act as the bottleneck to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single-instruction multiple-data (SIMD) hardware, or application-specific integrated circuits (ASICs), all of which offer large numbers of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable, and easy to implement. First, this thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations it takes to converge to the correct singular values, thus coming closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable frequency of operation. The system was developed with the objective of achieving high throughput; various modern cores available in FPGAs were used to maximize performance, and these modules are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks in a direction-of-arrival estimation system by providing simple, high-throughput, parallel algorithms.
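As a rough illustration of the rooting step (not the thesis's parallel hardware design), the sketch below runs Newton's method from several starting points at once on an example polynomial; in a root-MUSIC setting the polynomial coefficients would come from the noise subspace, and each starting point could map to one processing element. The polynomial, starting points, and iteration count here are placeholders.

```python
# Hedged sketch: Newton's method applied in parallel (vectorized here) to
# approximate polynomial roots from many starting points, as one might do
# per processing element in hardware. Placeholder polynomial, not root-MUSIC.
import numpy as np

coeffs = np.array([1.0, 0.0, -1.0, 0.0])   # p(z) = z^3 - z (roots: -1, 0, 1)
dcoeffs = np.polyder(coeffs)               # p'(z)

# Starting points spread on a circle in the complex plane.
z = 1.5 * np.exp(2j * np.pi * np.arange(8) / 8)

for _ in range(20):                        # fixed iteration count
    z = z - np.polyval(coeffs, z) / np.polyval(dcoeffs, z)

print(np.unique(np.round(z, 6)))           # converged root estimates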
Abstract:
BACKGROUND: This study is part of a cross-sectional evaluation of complementary medicine providers in primary care in Switzerland. It compares patient satisfaction with anthroposophic medicine (AM) and conventional medicine (CON). METHODS: We collected baseline data on structural characteristics of the physicians and their practices and on the health status and demographics of the patients. Four weeks later, patients assessed their satisfaction with the received treatment (five items, four-point rating scale) and evaluated the care provided by the practice (validated 23-item Europep questionnaire, five-point rating scale). 1946 adult patients of 71 CON and 32 AM primary care physicians participated. RESULTS: 1. Baseline characteristics: AM patients were more likely to be female (75.6% vs. 59.0%, p < 0.001) and had higher education (38.6% vs. 24.7%, p < 0.001). They suffered more often from chronic illnesses (52.8% vs. 46.2%, p = 0.015) and cancer (7.4% vs. 1.1%). AM consultations lasted on average 23.3 minutes (CON: 16.8 minutes, p < 0.001). 2. Satisfaction: More AM patients expressed general treatment satisfaction (56.1% vs. 43.4%, p < 0.001) and saw their expectations completely fulfilled at follow-up (38.7% vs. 32.6%, p < 0.001). AM patients reported significantly fewer adverse side effects (9.3% vs. 15.4%, p = 0.003) and more other positive effects from treatment (31.7% vs. 17.1%, p < 0.001). Europep: AM patients appreciated that their physicians listened to them (80.0% vs. 67.1%, p < 0.001), spent more time (76.5% vs. 61.7%, p < 0.001), had more interest in their personal situation (74.6% vs. 60.3%, p < 0.001), involved them more in decisions about their medical care (67.8% vs. 58.4%, p = 0.022), and made it easy to tell the physician about their problems (71.6% vs. 62.9%, p = 0.023). AM patients gave significantly better ratings for information and support (in 3 of 4 items, p ≤ 0.044) and for thoroughness (70.4% vs. 56.5%, p < 0.001). CONCLUSION: AM patients were significantly more satisfied and rated their physicians as valuable partners in the treatment. This suggests that, subject to certain limitations, AM therapy may be beneficial in primary care. To confirm this, more detailed qualitative studies would be necessary.
Abstract:
Software visualizations can provide a concise overview of a complex software system. Unfortunately, since software has no physical shape, there is no "natural" mapping of software to a two-dimensional space. As a consequence, most visualizations tend to use a layout in which position and distance have no meaning, and the layout typically diverges from one visualization to another. We propose a consistent layout for software maps in which the position of a software artifact reflects its vocabulary, and distance corresponds to similarity of vocabulary. We use Latent Semantic Indexing (LSI) to map software artifacts to a vector space, and then use Multidimensional Scaling (MDS) to map this vector space down to two dimensions. The resulting consistent layout allows us to develop a variety of thematic software maps that express very different aspects of software while making it easy to compare them. The approach is especially suitable for comparing views of evolving software, since the vocabulary of software artifacts tends to be stable over time.
Abstract:
Morphogenesis emerges from complex multiscale interactions between genetic and mechanical processes. To understand these processes, the evolution of cell shape, proliferation, and gene expression must be quantified. This quantification is usually performed either in full 3D, which is computationally expensive and technically challenging, or on 2D planar projections, which introduce geometrical artifacts on highly curved organs. Here we present MorphoGraphX (www.MorphoGraphX.org), a software platform that bridges this gap by working directly with curved surface images extracted from 3D data. In addition to traditional 3D image analysis, we have developed algorithms to operate on curved surfaces, such as cell segmentation, lineage tracking, and fluorescence signal quantification. The software's modular design makes it easy to include existing libraries or to implement new algorithms. Cell geometries extracted with MorphoGraphX can be exported and used as templates for simulation models, providing a powerful platform to investigate the interactions between shape, genes, and growth. DOI: http://dx.doi.org/10.7554/eLife.05864.001
Abstract:
Berra's ideas on teaching, which tended to impose a rigid naturalist methodology on education, exerted their influence on Argentine education even before his involvement in the school government of the province of Buenos Aires. Through his documentary archive one can come to know the intellectual life of the Río de la Plata region, on both shores, as well as the strictly scholastic world and its corresponding pedagogical manifestations. In this work we propose to study his activity in Uruguay between 1874 and 1882, a period not chosen by chance: it covers the beginning of his pedagogical activity in Uruguay, the writing of his principal work ("Los apuntes de pedagogía"), and his participation in the Pedagogical Congress of Buenos Aires, in the midst of the clash between liberals and clericals and the controversies unleashed by the advance of positivism.
Abstract:
Fiber-reinforced polymers (FRP) are used to strengthen concrete structures above all because of their excellent mechanical properties, their corrosion resistance, and their light weight, which makes them easy and inexpensive to transport and apply on site: installation is very fast, requires few workers and only light ancillary equipment, and thus minimizes both interruptions to the use of the structure and inconvenience to its users. These advantages have aroused great interest among research groups worldwide, which are currently developing new application techniques and calculation methods, and have led to countless applications the world over. For flexural strengthening, the research conducted to date has produced a well-defined and generally accepted design procedure; this is not the case for shear strengthening. Although FRP has been shown to be an effective system for increasing ultimate shear capacity, further experimental and theoretical studies are needed to advance the understanding of the mechanisms involved and to establish a suitable design procedure that makes the most of the excellent properties of this material. The models that explain the shear behavior of reinforced concrete (RC) members are complex and cannot be directly transposed into engineering formulas. The standards currently in force generally establish shear capacity empirically as the sum of the contributions of the concrete and the transverse steel reinforcement. When a member is externally strengthened with FRP, the models are evidently even more complex. The existing guides and recommendations propose calculating the capacity of the member by adding the strength provided by the external FRP to that already given by the concrete and the transverse steel. The suitability of this approach is questionable, however, since it does not account for possible interaction between the internal and external shear reinforcement. This is the origin of the subject of this work, which studies the shear behavior of reinforced concrete members externally strengthened with a composite of unidirectional carbon fiber sheet and epoxy resin. First, a thorough review of the state of the art on the shear strength of reinforced concrete members with and without external FRP strengthening is presented, paying special attention to the acting mechanisms studied to date. The exhaustive and up-to-date literature review allowed the most important proposed models to be studied, both for describing the concrete-FRP bond phenomenon and for evaluating the FRP contribution to the total shear capacity, through two databases: one of pull-out tests and one of shear tests on externally strengthened reinforced concrete beams. On this basis, the mechanisms acting in the FRP shear contribution are set out, together with the way the main existing design guidelines address them. Likewise, a stress-resistance model for the FRP is defined, and two models for calculating the effective stresses or strains are proposed: one based on the bond model proposed by Oller (2005) and the other on a multivariate regression over the mechanisms described. To complement the study of the work found in the literature, an experimental program was carried out that, besides adding records to the meager existing database, sheds light on the points considered to be poorly resolved. The program comprised 32 tests on 16 beams 4.5 m long (two tests per beam), shear-strengthened with unidirectional CFRP sheet. Finally, these studies have made it possible to propose modifications to the formulations in the codes and guidelines currently in force.
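The additive format used by the existing guidelines, and whose suitability the work questions, can be stated compactly. As a hedged illustration (the symbols below are our own shorthand, not notation taken from any particular code), the ultimate shear capacity of an externally strengthened member is computed as

```latex
% Additive shear-capacity format of the existing guidelines, as described
% above; the symbols are our shorthand: V_c = concrete contribution,
% V_s = transverse steel contribution, V_f = external FRP contribution.
V_u = V_c + V_s + V_f
```

The criticism summarized in the abstract is precisely that this sum treats the terms as independent, whereas the internal steel and the external FRP shear reinforcement may interact.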
Abstract:
Current communication systems face many new challenges, such as numerous competing standards and the scarcity of frequency resources; in particular, the rapid development of personal wireless communication systems means that new systems are updated faster than ever before, and conventional hardware-based wireless communication systems struggle to adapt. The emergence of software-defined radio (SDR) enabled a third revolution in wireless communication, shifting from hardware to software and providing a flexible, reliable, upgradable, reusable, reconfigurable, and low-cost platform. The Universal Software Radio Peripheral (USRP) products are commonly used with the GNU Radio software suite to create complex SDR systems. GNU Radio is a toolkit in which digital signal processing blocks are written in C++ and connected to each other with Python. This makes it easy to develop sophisticated signal processing systems, because many blocks have already been written by others and can quickly be put together to create a complete system. Although GNU Radio's main function is not to be a simulator, in the absence of RF hardware it supports research into signal processing algorithms using pre-stored data or data produced by a signal generator. This thesis introduces the SDR platform in terms of both hardware (USRP) and software (GNU Radio), as well as some basic modulation techniques used in wireless communication systems. The hardware employed is a commercial development kit from Ettus Research consisting of a signal processing module and simple RF hardware: a general-purpose microprocessor, a programmable device (FPGA), and a radio-frequency front end covering 50 to 2200 MHz, connected to the PC through a USB interface with a speed of 8 Mb/s. The development software is Linux-based; GNU Radio applications are implemented mainly in Python, while the signal processing module is built in C++ on a microprocessor with floating-point arithmetic, so developers can quickly and easily build real-time, high-capacity wireless communication applications. Starting from the examples provided by GNU Radio, we carried out several experiments on the USRP, such as GSM band scanning and FM radio reception, and then built on the experience of earlier investigators to observe an OFDM spectrum and to simulate real-time video transmission. GNU Radio combined with USRP hardware proved to be a valuable laboratory platform for implementing complex radio system prototypes in a short time, and this evaluation leaves us in a position to tackle the development of complex applications.
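To give a flavor of how GNU Radio blocks are connected with Python, here is a minimal flowgraph sketch (our own illustration, not from the thesis): it generates a cosine tone and writes it to a file, which is also how one can exercise signal processing code without RF hardware. The block names follow the standard GNU Radio 3.x Python API, but versions differ, so treat the exact calls as an assumption.

```python
# Minimal GNU Radio flowgraph sketch (illustrative, not from the thesis):
# signal source -> throttle -> file sink, i.e. simulation without RF hardware.
from gnuradio import gr, analog, blocks

class ToneToFile(gr.top_block):
    def __init__(self, samp_rate=32000, freq=1000):
        gr.top_block.__init__(self, "tone_to_file")
        src = analog.sig_source_f(samp_rate, analog.GR_COS_WAVE, freq, 0.5)
        throttle = blocks.throttle(gr.sizeof_float, samp_rate)  # pace the flow
        sink = blocks.file_sink(gr.sizeof_float, "tone.dat")
        self.connect(src, throttle, sink)

if __name__ == "__main__":
    import time
    tb = ToneToFile()
    tb.start()
    time.sleep(1.0)   # let the flowgraph run briefly
    tb.stop()
    tb.wait()
```

With a USRP attached, the file sink would typically be replaced by a hardware sink block, which is what turns such a simulation into an over-the-air experiment.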
Abstract:
This project analyzes the characteristics and the design cycle associated with the IspLEVER CAD environment by Lattice Semiconductor, in order to evaluate its suitability for teaching wired digital systems engineering. Based on this study, a guide to the use of the different tools integrated in the environment was produced. In addition, several device families from Lattice Semiconductor were characterized, to support the choice of a device from this manufacturer for a given design. The study of the environment and the tools integrated in IspLEVER began with a familiarization with the working framework, initially through the documentation offered by the manufacturer on its web page, http://www.latticesemi.com. After this reading, which gave a first view of the characteristics of the tool, the installation package was downloaded; the manufacturer offers an evaluation version that expires after 12 months. Once downloaded, it was installed, and to finish the preparations the licensing procedure was completed, leaving the software ready for use. Several hours of work were then spent trying to create designs without consulting any documentation, in order to gauge how intuitive the environment is for someone who already knows electronic CAD tools. After this first contact with the real environment, the different options it offers for creating designs, whether logical or physical, were studied. Besides studying all the possibilities the environment offers, the work focused on detecting and comparing the different ways of performing the same task, such as pin assignment or reviewing simulation results, among others. In parallel with the study of these options, the different tools integrated in the environment were examined. Once the environment and its tools had been studied, a tutorial was written, capturing all the screenshots considered helpful so that students can comfortably and easily follow its instructions through a complete logical design cycle. After the tutorial, the extensive documentation the manufacturer provides for each of its device families was reviewed. The purpose of this review was to characterize the different families as a support for choosing one of the manufacturer's devices for a given design, and also to identify which family would be the most suitable to supply a device for a hypothetical prototyping board for laboratory practice.
Abstract:
The International Standard ISO 140-5 on field measurements of airborne sound insulation of façades establishes that the directivity of the measurement loudspeaker should be such that the variation in the local direct sound pressure level (ΔSPL) on the sample is ΔSPL < 5 dB (or ΔSPL < 10 dB for large façades). This condition is usually not easy to fulfil, nor is it easy to verify whether the loudspeaker produces such a uniform level. Direct sound pressure levels on the ISO standard façade essentially depend on the distance and directivity of the loudspeaker used. This paper presents a comprehensive analysis of the test geometry for measuring sound insulation and explains how the loudspeaker directivity, combined with distance, affects the acoustic level distribution on the façade. The first sections of the paper focus on analysing the measurement geometry and its influence on the direct acoustic level variations on the façade. The most and least favourable positions for minimising these direct acoustic level differences are found, and the angles subtended by the façade in the reference system of the loudspeaker are also determined. Then, the maximum dimensions of the façade that meet the conditions of the ISO 140-5 standard are obtained for an ideal omnidirectional sound source and for a piston radiating in an infinite baffle, which is chosen as the typical radiation pattern for loudspeakers. Finally, a complete study of the behaviour of different loudspeaker radiation models (such as those usually employed in ISO 140-5 measurements) is performed, comparing their radiation maps on the façade in order to find the maximum façade dimensions and the most appropriate radiation configurations.
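To make the geometric dependence concrete, the following hedged sketch (our illustration; the geometry values are placeholders, not the paper's) estimates ΔSPL over a rectangular façade for an ideal omnidirectional point source, for which the direct level varies only with distance, so the spread reduces to 20·log10 of the ratio of the farthest to the nearest source-façade distance.

```python
# Hedged sketch: direct-level variation over a façade for an ideal
# omnidirectional point source (free field), where the SPL spread reduces
# to 20*log10(r_max / r_min). Geometry values are placeholders; the actual
# ISO 140-5 geometry (e.g. 45-degree incidence) is simplified here.
import numpy as np

x = np.linspace(-2.0, 2.0, 41)        # horizontal positions on façade (m)
z = np.linspace(0.0, 3.0, 31)         # vertical positions on façade (m)
X, Z = np.meshgrid(x, z)
source = np.array([0.0, -5.0, 0.0])   # loudspeaker 5 m in front, on the ground

R = np.sqrt((X - source[0])**2 + source[1]**2 + (Z - source[2])**2)
delta_spl = 20 * np.log10(R.max() / R.min())  # level spread in dB

print(f"Distance range: {R.min():.2f} .. {R.max():.2f} m")
print(f"Delta SPL = {delta_spl:.2f} dB (ISO 140-5 limit: 5 dB)")
```

For a directive source such as the baffled piston discussed in the paper, the directivity factor would multiply the distance term, generally tightening the maximum façade dimensions that still satisfy the 5 dB condition.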
Abstract:
Background: Gray scale images make up the bulk of the data in bio-medical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks; specifically, the management of working memory is handled automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to these high-level processing tools is to develop new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time-intensive, and also not appropriate for a researcher with little or no knowledge of software development. Another alternative is to use command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only a few tools provide this kind of processing interface; they are usually quite task-specific, and they do not offer a clear path when one wants to shape a new command line tool from a prototype shell script. Results: The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that makes it possible to run image processing tasks interactively in a command shell and to prototype using the corresponding shell scripting language. Since the hard disk serves as the temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design, based on atomic plug-ins and single-task command line tools, makes it easy to extend MIA, usually without the need to touch or recompile existing code. Conclusion: In this article, we describe the general design of MIA, a general-purpose framework for gray scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms with shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.
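To illustrate the prototyping style the abstract describes, chaining small command line tools through on-disk intermediates with string-based filter descriptions, here is a hedged Python sketch that drives such tools via subprocess. The tool name mia-2dimagefilter, its flags, and the filter strings are assumptions for illustration only; consult the MIA documentation for the actual tool names and filter syntax.

```python
# Hedged sketch of a MIA-style prototyping workflow: small single-task
# command line tools chained through files on disk. The tool name, flags,
# and filter description strings below are assumptions, not verified syntax.
import subprocess

def run_filter(tool, infile, outfile, *filters):
    """Run one single-task command line tool; intermediates live on disk."""
    subprocess.run([tool, "-i", infile, "-o", outfile, *filters], check=True)

# A two-step pipeline with string-based filter descriptions (illustrative).
run_filter("mia-2dimagefilter", "input.png", "smooth.png", "gauss:w=2")
run_filter("mia-2dimagefilter", "smooth.png", "edges.png", "sobel:dir=x")
```

Because every intermediate result lands on disk, working-set memory stays small regardless of data-set size, which is exactly the trade-off against the high-level frameworks discussed above.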