908 results for systems design
Abstract:
This paper proposes a novel differential mixer topology. The traditional switching stage is replaced by a combined stack of NMOS and PMOS transistors. A design of a 900 MHz down-conversion mixer in a 0.35 μm CMOS process is presented. Comparison with a conventional mixer shows that the topology yields better conversion gain and linearity. ©2012 IEEE.
Abstract:
We present a generalized test case generation method, called the G method. Although inspired by the W method, the G method, in contrast, allows for test case suite generation even in the absence of characterization sets for the specification models. Instead, the G method relies on knowledge about the index of certain equivalences induced at the implementation models. We show that the W method can be derived from the G method as a particular case. Moreover, we discuss some naturally occurring infinite classes of FSM models over which the G method generates test suites that are exponentially more compact than those produced by the W method.
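The abstract does not spell out the G method's construction, but the classical W method it generalizes is easy to sketch: concatenate a transition cover P of the specification FSM with a characterization set W. The toy Mealy machine, its encoding, and the characterization set below are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# Toy Mealy FSM: (state, input) -> (next_state, output). Illustrative only.
FSM = {
    ("s0", "a"): ("s1", 0), ("s0", "b"): ("s0", 1),
    ("s1", "a"): ("s0", 1), ("s1", "b"): ("s1", 0),
}
INPUTS = ["a", "b"]
INITIAL = "s0"

def state_cover(fsm, initial, inputs):
    """Shortest input sequence reaching each state (BFS from the initial state)."""
    cover, frontier = {initial: ()}, [initial]
    while frontier:
        nxt = []
        for s in frontier:
            for i in inputs:
                t = fsm[(s, i)][0]
                if t not in cover:
                    cover[t] = cover[s] + (i,)
                    nxt.append(t)
        frontier = nxt
    return cover

def w_method_suite(fsm, initial, inputs, W):
    """P . W for m = n: transition cover concatenated with characterization set W."""
    cover = state_cover(fsm, initial, inputs)
    P = set(cover.values()) | {cover[s] + (i,) for s in cover for i in inputs}
    return {p + w for p, w in product(P, W)}

# The single input "a" distinguishes s0 from s1 here (outputs 0 vs 1),
# so W = {("a",)} is a valid characterization set for this toy machine.
suite = w_method_suite(FSM, INITIAL, INPUTS, {("a",)})
```

The G method's advantage, per the abstract, is precisely that it can avoid enumerating such a W when one is unavailable, relying instead on the index of equivalences on the implementation model.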
Abstract:
Investigating parents’ formal engagement opportunities in public schools serves well to characterize the relationship between states and societies. While the relationship between parental involvement and students’ academic success has been thoroughly investigated, it has rarely been treated as an indicator of a country’s governing regime. The researcher set out to see whether and how parents’ voice differs across democracies. The hypothesis was that in mature regimes, institutional opportunities for formal parental engagement are plentiful and parents are actively involved, while in young democracies there are fewer opportunities and engagement is lower. It was also assumed that parental deliberation in expressing dissatisfaction with schools differs across democracies: where it is more intense, it translates into higher engagement. Parents’ awareness of relevant regulations and agendas was assumed to be equally average, and their demographic background was assumed to have similar effects on engagement. A comparative, most-different-systems design was employed, with the parents of final-year public middle school students in Tartu, Estonia and in Huntsville, Alabama, United States serving as the sample. The multidimensional study includes a theoretical review, country and community analyses, an institutional analysis of formal parental involvement, and a parent survey. The findings revealed sizeable differences between parents’ engagement levels in Huntsville and Tartu. The results indicate passivity in both communities, while in Tartu engagement seems to be alarmingly low. Furthermore, Tartu parents have far fewer institutional opportunities to engage. In the United States, multilevel efforts to engage parents are visible from the local to the federal level; in Estonia similar intentions seem to be missing and meaningful parental organizations do not exist. In terms of civic education there is much room for development in both countries.
The road will be longer for Estonia, a young democracy, in transforming its institutional systems from formally democratic to inherently inclusive.
Abstract:
Over the last few decades, unprecedented technological growth has been at the center of embedded systems design, with Moore’s Law as the leading factor of this trend. Today an ever-increasing number of cores can be integrated on the same die, marking the transition from state-of-the-art multi-core chips to the new many-core design paradigm. Despite the extraordinarily high computing power, the complexity of many-core chips opens the door to several challenges. As a result of the increased silicon density of modern Systems-on-a-Chip (SoC), the design space that must be explored to find the best design has exploded, and hardware designers face the problem of a huge design space. Virtual Platforms have always been used to enable hardware-software co-design, but today they must cope with the huge complexity of both hardware and software systems. In this thesis two research works on Virtual Platforms are presented: the first is intended for the hardware developer, to easily allow complex cycle-accurate simulations of many-core SoCs. The second exploits the parallel computing power of off-the-shelf General Purpose Graphics Processing Units (GPGPUs), with the goal of increased simulation speed. The term virtualization can be used in the context of many-core systems not only to refer to the aforementioned hardware emulation tools (Virtual Platforms), but also for two other main purposes: 1) to help the programmer achieve the maximum possible performance of an application by hiding the complexity of the underlying hardware; 2) to efficiently exploit the highly parallel hardware of many-core chips in environments with multiple active Virtual Machines. This thesis focuses on virtualization techniques that aim to mitigate, and where possible overcome, some of the challenges introduced by the many-core design paradigm.
Abstract:
Library of Congress Subject Headings (LCSH), the standard subject language used in library catalogues, are often criticized for their lack of currency, biased language, and atypical syndetic structure. Conversely, folksonomies (or tags), which rely on the natural language of their users, offer a flexibility often lacking in controlled vocabularies and may offer a means of augmenting more rigid controlled vocabularies such as LCSH. Content analysis studies have demonstrated the potential for folksonomies to be used as a means of enhancing subject access to materials, and libraries are beginning to integrate tagging systems into their catalogues. This study examines the utility of tags as a means of enhancing subject access to materials in library online public access catalogues (OPACs) through usability testing with the LibraryThing for Libraries catalogue enhancements. Findings indicate that while they cannot replace LCSH, tags do show promise for aiding information seeking in OPACs. In the context of information systems design, the study revealed that while folksonomies have the potential to enhance subject access to materials, that potential is severely limited by the current inability of catalogue interfaces to support tag-based searches alongside standard catalogue searches.
Abstract:
OBJECTIVES: To validate the Probability of Repeated Admission (Pra) questionnaire, a widely used self-administered tool for predicting future healthcare use in older persons, in three European healthcare systems. DESIGN: Prospective study with 1-year follow-up. SETTING: Hamburg, Germany; London, United Kingdom; Canton of Solothurn, Switzerland. PARTICIPANTS: Nine thousand seven hundred thirteen independently living community-dwelling people aged 65 and older. MEASUREMENTS: Self-administered eight-item Pra questionnaire at baseline. Self-reported number of hospital admissions and physician visits during 1 year of follow-up. RESULTS: In the combined sample, areas under the receiver operating characteristic curves (AUCs) were 0.64 (95% confidence interval (CI)=0.62-0.66) for the prediction of one or more hospital admissions and 0.68 (95% CI=0.66-0.69) for the prediction of more than six physician visits during the following year. AUCs were similar between sites. In comparison, prediction models based on a person's age and sex alone exhibited poor predictive validity (AUC
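The reported AUCs have a simple rank-statistic reading: the probability that a randomly chosen person who was later admitted received a higher Pra score than a randomly chosen person who was not (ties counting one half). A minimal sketch of that computation, with made-up scores rather than the study's data:

```python
def auc(scores_pos, scores_neg):
    """ROC area as a rank statistic: P(random positive scores higher than
    a random negative), counting ties as 1/2."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

# Illustrative Pra-like risk scores (not the study's data):
admitted = [0.62, 0.55, 0.41]      # people with a hospital admission at follow-up
not_admitted = [0.50, 0.33, 0.28]  # people without one
```

With these toy numbers the AUC is 8/9; the study's 0.64 means the questionnaire orders such random pairs correctly only modestly more often than chance (0.5).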
Abstract:
High flux and high CRI may be achieved by combining different chips and/or phosphors. This, however, results in inhomogeneous sources that, when combined with collimating optics, typically produce patterns with undesired artifacts. These may be a combination of spatial, angular, or color non-uniformities. To avoid these effects, the light source must be mixed both spatially and angularly. Diffusers can achieve this, but they also increase the etendue (and reduce the brightness) of the resulting source, leading to optical systems of increased size and wider emission angles. The shell mixer is an optic comprising many lenses on a shell covering the source. These lenses perform Köhler integration to mix the emitted light, both spatially and angularly. Placed on top of a multi-chip Lambertian light source, it produces a highly homogeneous virtual source (i.e., spatially and angularly mixed), also Lambertian, located in the same position and with essentially the same size (so the average brightness is preserved). This virtual light source can then be collimated with another optic, resulting in a homogeneous pattern without color separation. Experimental measurements have shown an optical efficiency of the shell of 94% and a highly homogeneous angular intensity distribution of the collimated beams, in good agreement with ray-tracing simulations.
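The etendue argument above can be made concrete with the standard formula for a uniform source in air, E = π·A·sin²θ: since etendue cannot decrease through a passive optic, narrowing the beam requires enlarging the exit aperture, which is why a diffuser's added etendue costs size. A small sketch (the formulas are standard; the numbers are illustrative):

```python
import math

def lambertian_etendue(area_mm2, half_angle_deg):
    """Etendue (mm^2*sr) of a uniform source in air: E = pi * A * sin^2(theta)."""
    return math.pi * area_mm2 * math.sin(math.radians(half_angle_deg)) ** 2

def collimated_half_angle(source_area, exit_area):
    """Smallest beam half-angle (deg) an exit aperture can deliver while
    conserving the etendue of a Lambertian (90 deg half-angle) source."""
    s = math.sqrt(source_area / exit_area)
    return math.degrees(math.asin(min(1.0, s)))

# A 1 mm^2 Lambertian chip collimated by a 100 mm^2 aperture can reach
# a ~5.7 deg half-angle at best; any added etendue forces a wider beam
# or a bigger optic.
```

This is the trade-off the shell mixer sidesteps: by keeping the virtual source at the same position and size, it mixes the light without paying this etendue penalty.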
Abstract:
Two quasi-aplanatic free-form solid V-groove collimators are presented in this work. Both optical designs are obtained using the Simultaneous Multiple Surface method in three dimensions (SMS 3D), and in both free-form V-groove devices the second optically active surface is designed a posteriori as a grooved surface. The first, a two-mirror (XX) design, is meant to clearly show the design procedure and working principle of these devices. The second, a free-form RXI design, is comparable with existing RXI collimators: it is a compact and highly efficient design made of polycarbonate (PC) that performs very good colour mixing of RGGB LED sources placed off-axis. Rotationally symmetric non-aplanatic collimators achieve high efficiency but have colour mixing that needs improvement, while rotationally symmetric aplanatic devices achieve good colour mixing but have efficiency that needs improvement. The aim of this work was therefore to design a free-form device that improves the colour mixing of the rotationally symmetric non-aplanatic RXI devices and the efficiency of the aplanatic ones.
Abstract:
Previous publications (Miñano et al., 2011) have shown that a Spherical Geodesic Waveguide (SGW) can achieve super-resolution up to λ/500 close to a set of discrete frequencies. These frequencies are directly connected with the well-known Schumann resonance frequencies of spherically symmetric systems. However, the SGW was presented as an ideal system, without taking into account technological obstacles, manufacturing feasibility, and their influence on the final results. In order to prove the concept of super-resolution experimentally, the SGW is modified according to manufacturing requirements and technological limitations. Each manufacturing process imposes imperfections which can affect the experimental results. Here, we analyze the influence of these manufacturing limitations on the super-resolution properties of the SGW. Besides the theoretical work, experimental results are presented as well.
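For a lossless thin spherical cavity of radius a, the Schumann resonance frequencies referred to above follow from f_n = (c / 2πa)·√(n(n+1)). A quick sketch, using Earth's radius as the familiar example (the real, lossy ionospheric cavity resonates lower, near 7.8 Hz for n = 1):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def schumann_freq(n, radius_m):
    """Idealized Schumann resonance f_n = c/(2*pi*a) * sqrt(n*(n+1))
    for a lossless thin spherical cavity of radius a."""
    return C / (2 * math.pi * radius_m) * math.sqrt(n * (n + 1))

# Earth-sized cavity (a ~ 6371 km): the ideal n = 1 mode sits near 10.6 Hz.
f1 = schumann_freq(1, 6.371e6)
```

The same formula scaled to an SGW's much smaller radius places its discrete resonances at correspondingly higher frequencies, which is where the λ/500 behaviour is reported to appear.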
Abstract:
In this work, novel imaging designs with a single optical surface (either refractive or reflective) are presented. In some of these designs, both the object and image shapes are given, and the mapping from object to image is obtained as a result of the design. In other designs, not only the mapping but also the shape of the object is found in the design process. In the examples considered, the image is virtual and located at infinity, and is seen from a known pupil, which can emulate a human eye. In the first, introductory part, 2D designs are carried out using three different design methods: an SMS design, a compound Cartesian oval surface, and a differential equation method for the limit case of a small pupil. It is proven that at the point-size pupil limit these three methods coincide. In the second part, the previous 2D designs are extended to 3D by rotation, and the astigmatism of the image is studied. As an advanced variation, the differential equation method is used to provide the freedom to control tangential and sagittal rays simultaneously. As a result, designs without astigmatism (at the small pupil limit) on a curved object surface have been obtained. Finally, this anastigmatic differential equation method has been extended to 3D for the general case, in which freeform surfaces are designed.
Abstract:
The Negative Refractive Lens (NRL) has shown that an optical system can produce images with details below the classic Abbe diffraction limit. This optical system transmits the electromagnetic fields emitted by an object plane towards an image plane, producing the same field distribution in both planes. In particular, a Dirac delta electric field in the object plane is focused, without diffraction limit, to a Dirac delta electric field in the image plane. Two devices with positive refraction, the Maxwell Fish Eye lens (MFE) and the Spherical Geodesic Waveguide (SGW), have been claimed to break the diffraction limit using positive refraction, although in a different sense. In these cases, what is considered is the power transmission from a point source to a point receptor, which falls drastically when the receptor is displaced from the focus by a distance much smaller than the wavelength. Although these systems can detect displacements down to λ/3000, they cannot be compared to the NRL, since the concept of image is different: the SGW deals only with a point source and drain, while in the NRL there is an object and an image surface. Here, an analysis of the SGW with defined object and image surfaces (both conical), analogous to the NRL case, is presented. The results show that a Dirac delta electric field on the object surface produces an image below the diffraction limit on the image surface.
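For scale, the classic Abbe limit invoked above is d = λ/(2·NA). A quick comparison of that bound against the λ/3000 displacement sensitivity reported for the MFE/SGW; the wavelength and numerical aperture below are illustrative choices, not values from the paper:

```python
def abbe_limit(wavelength, numerical_aperture):
    """Classic Abbe resolution limit: d = lambda / (2 * NA)."""
    return wavelength / (2.0 * numerical_aperture)

# Green light through high-NA conventional optics (illustrative numbers):
wavelength = 532e-9               # m
d_abbe = abbe_limit(wavelength, 1.4)   # ~190 nm: finest detail Abbe allows
d_sgw = wavelength / 3000              # ~0.18 nm displacement scale reported
```

The three-orders-of-magnitude gap between the two lengths is what makes the "different meaning" caveat important: detecting a sub-wavelength displacement of a single point drain is not the same as forming an image of an extended object at that resolution.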
Abstract:
Aplanatic designs are of great interest in the optics field since they are free from spherical aberration and from linear coma in the axial direction. Nevertheless, no thin aplanatic design based on a lens can be found in the literature to date. This work presents the first aplanatic thin lens (in this case a dome-shaped faceted TIR lens performing light collimation), designed for LED illumination applications. Due to its TIR structure (defined as an anomalous microstructure, as we will see), this device presents good color-mixing properties as well as high optical efficiency, which we show by means of ray-trace simulations.
Abstract:
LEDs are replacing fluorescent and incandescent bulbs as illumination sources due to their low power consumption and long lifetime. Visible Light Communications (VLC) makes use of LEDs’ short switching times to transmit information. Although LED switching speeds are in the Mbps range, higher speeds (hundreds of Mbps) can be reached by using modulation techniques with high bandwidth efficiency. However, these techniques require a more complex driver, which drastically increases its power consumption. In this work, an energy-efficiency analysis of the different VLC modulation techniques and drivers is presented. In addition, the design of new VLC driver schemes is described.
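The bandwidth-efficiency trade-off behind that claim can be sketched with the ideal bound R = B·log₂(M) for M-ary modulation. The LED bandwidth figure below is an assumed, typical value for illustration, not one from the paper:

```python
import math

def bit_rate(bandwidth_hz, constellation_size):
    """Ideal symbol-rate bound: R = B * log2(M) bits/s for M-ary modulation."""
    return bandwidth_hz * math.log2(constellation_size)

LED_BANDWIDTH = 5e6  # Hz -- assumed 3 dB bandwidth of a phosphor white LED

ook = bit_rate(LED_BANDWIDTH, 2)     # on-off keying: 1 bit/symbol, a simple
                                     # switching driver suffices
qam64 = bit_rate(LED_BANDWIDTH, 64)  # 64-point constellation: 6 bits/symbol,
                                     # but the multilevel waveform needs a
                                     # linear driver with much higher power draw
```

The 6x rate gain of the denser constellation is exactly what motivates the paper's question: whether the extra bits per symbol are worth the driver's added power consumption.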
Abstract:
The starting point of this thesis is the hypothesis that it is possible to establish global evaluation methods for the degree of utility of the construction systems used in building enclosures. Such methods should make it possible to determine, from a finite set of alternative systems, which are objectively the most suitable for selection in a specific decision context, and should provide an objective justification for that decision. Alongside this general hypothesis, a second, more specific hypothesis was posed from the outset: that construction systems based on prefabricated components, or on site processes with a high degree of industrialization, would yield higher utility values than traditional masonry-based systems.
To test these two hypotheses, a coherent set of twelve building enclosure systems was selected as a representative sample of their potential diversity, and a comparative evaluation was carried out. The proposed evaluation method considers factors of very different kinds that can be reduced neither to a single parameter or magnitude allowing a linear assessment of relative suitability, nor to an absolute ranking of the alternative construction systems. To resolve this methodological challenge, evaluation methodologies that allow such comparisons to be made rationally were applied: a family of methods originating in the exact sciences, known as multi-criteria decision aid methods, and in particular the ELECTRE method.
The analysis was initially applied to twelve construction systems selected to adequately represent the three categories established to characterize all possible enclosure systems: weight, degree of prefabrication, and degree of ventilation. Although the combination of these three basic categories yields eighteen conceptual subcategories, twelve were finally retained, this number being considered sufficiently extensive for the proposed analysis while eliminating irrelevant types. Applying the proposed method to these twelve "sample" systems confirmed the higher utility of prefabricated, heavy, non-ventilated systems. Following the analysis in Part II of the thesis, the construction systems included in the Catálogo de Elementos Constructivos of the CTE (2010 version) were mapped onto the eighteen subcategories defined in Part II, and all façade enclosure systems included in the Catalogue were then parameterized. This systematic parameterization made it possible, by computing the mean parameter values of the systems belonging to each family established by the Catalogue, to characterize comparatively, in an indicative way, the degree of utility of those families, both parameter by parameter and in an overall assessment.
Once the parameterization of all systems in the Catalogue was complete, a simulated application of the validation methodology developed in Part II of this thesis was carried out in order to check its suitability for this case. In conclusion, the development of a multi-criteria decision aid tool applied to the CTE Catalogue of constructive elements has proved technically feasible and yields significant results. Two construction systems were designed with the developed tool, one non-ventilated façade and one ventilated façade; compared with the other construction systems analyzed, the two designed systems show a high degree of objective utility. This exercise of designing a specific construction system that meets the requirements of a particular decision-maker thus demonstrates the usefulness of the proposed algorithm when applied to the design of construction systems. The thesis incorporates two methodological innovations and three instrumental innovations.
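The ELECTRE family of outranking methods mentioned in this abstract works by pairwise concordance/discordance tests rather than by collapsing criteria into one aggregate score, which is what lets it handle criteria that "cannot be reduced to a single parameter." A minimal ELECTRE I-style sketch; the alternatives, weights, scores, and thresholds below are illustrative assumptions, not the thesis's twelve enclosure systems:

```python
WEIGHTS = [0.5, 0.3, 0.2]           # criterion weights, summing to 1 (illustrative)
SCORES = {                           # alternative -> score per criterion (higher = better)
    "prefab_heavy": [8, 7, 6],
    "masonry":      [6, 8, 5],
}

def concordance(a, b):
    """Total criterion weight on which alternative a is at least as good as b."""
    return sum(w for w, x, y in zip(WEIGHTS, SCORES[a], SCORES[b]) if x >= y)

def discordance(a, b, scale=10.0):
    """Worst normalized margin by which a loses to b on any single criterion."""
    return max((y - x) / scale for x, y in zip(SCORES[a], SCORES[b]))

def outranks(a, b, c_min=0.6, d_max=0.3):
    """a outranks b if concordance is high enough and no criterion vetoes it."""
    return concordance(a, b) >= c_min and discordance(a, b) <= d_max
```

Note the asymmetry the method allows: with these toy numbers the prefabricated system outranks the masonry one, but not vice versa, without either ever being reduced to a single utility number.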
Abstract:
These days, as we face extremely powerful attacks on servers over the Internet (say, by Advanced Persistent Threat attackers or surveillance by a powerful adversary), Shamir has claimed that “Cryptography is Ineffective”, and some understood it as “Cryptography is Dead!” In this talk I will discuss the implications for cryptographic systems design when facing such strong adversaries. Is crypto dead, or do we need to design it better, taking into account not only mathematical constraints but also system vulnerability constraints? Can crypto be effective at all when your computer or your cloud is penetrated? What is lost and what can be saved? These are very basic issues at this point in time, when we face a potential loss of privacy and security.