897 results for market design
Abstract:
Based on a structured literature review, the ceramic tile sectors of Italy (the benchmark) and Brazil (the world's second-largest producer and consumer) are compared along four strategic factors: normative, market, technology and strategic management, in order to identify critical risks for a strategically important national sector. The document aims to propose guidelines for a strategic re-planning of the Brazilian ceramic tile sector, making Brazilian producers aware of the fragility of the national market (in spite of its recent remarkable evolution) and helping policy makers to reflect on the need to review strategic planning methods and practice, to design new targeted programs (based on coherence between operations and business strategies), and to provide improved management that strengthens the sector against unfair competition from low-cost producers, enhancing the necessary infrastructure in technology, labor, marketing and quality management. The analysis is limited to single-firing production technology. This wide-coverage strategic analysis of the Brazilian ceramic tile sector, which has received very little scientific study until now, emphasizes the importance of applying research methodology and may be valuable to both scholars and practitioners. Additionally, it highlights the need for investment in innovation (product design and production technology) and the fundamental role of sector organization, identifying its different dimensions. It is possible to conclude that the recent growth of Brazilian production does not stem from a genuine strengthening of the sector or from sound enterprise strategies, but seems to be the result of a temporary and favorable economic contingency.
Abstract:
This paper presents an analysis of the capacity of design-centric methodologies to prepare engineering students to succeed in the market. Gaps are brainstormed and analyzed with reference to their importance. The reasons that may lead newly graduated engineers not to succeed right from the beginning of their professional lives are also evaluated. A comparison between the two subjects above was prepared, reviewed and analyzed. The multidisciplinary, multicultural and complex environment created in the current global business era is taken into account. Industry requirements, in terms of what companies expect to 'receive' from their engineers, are evaluated and compared with the rest of the study. An innovative approach to current engineering education that utilizes traditional design-centric methodologies is then proposed, aggregating new disciplines to supplement traditional engineering education. The solution encompasses the inclusion of disciplines from the Human Sciences and Emotional Intelligence fields, intended to better prepare the engineer of tomorrow to work in a multidisciplinary, globalized, complex and team-oriented environment. A pilot implementation of such an approach is reviewed and conclusions are drawn from this educational project.
Abstract:
General combining ability (GCA), specific combining ability (SCA) and heterosis were studied in a complete diallel cross, with reciprocals excluded, among fresh-market tomato breeding lines. Fifteen genotypes (five parents and ten hybrids) were tested using a randomized complete block design with three replications; the experiments were conducted in Itatiba, São Paulo state, Brazil, in 2005/06. The yield components evaluated were fruit yield per plant (FP), fruit number per plant (FN), average fruit weight (FW), cluster number per plant (CN), fruit number per cluster (FC), fruit wall thickness (FT) and number of locules per fruit (NL). The fruit quality components evaluated were total soluble solids (SS), total titratable acidity (TA), SS/TA ratio, fruit length (FL), fruit width (WI) and length-to-width ratio (FL/WI). The data for each trait were first subjected to analysis of variance. Griffing's method 2, model 1 was employed to estimate the general (GCA) and specific (SCA) combining abilities. Parental and hybrid data for each trait were used to estimate mid-parent heterosis. For fruit yield per plant, IAC-2 was the best parental line, with the highest GCA, followed by the IAC-4 and IAC-1 lines. The hybrids IAC-1 x IAC-2, IAC-1 x IAC-4 and IAC-2 x IAC-4 showed the highest SCA effects. High heterotic responses were found for fruit yield and fruit number per plant, with values up to 49.72% and 47.19%, respectively. The best hybrids for fruit yield and fruit number per plant, the main yield components, were IAC-1 x IAC-2, IAC-1 x IAC-4 and IAC-2 x IAC-5.
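The mid-parent heterosis percentages reported above follow the standard definition, 100 × (F1 mean − mid-parent mean) / mid-parent mean. As a hedged illustration only (not the authors' code, and with invented trait values rather than data from the study), the short Python sketch below computes that quantity for one hybrid and one trait.

# Minimal sketch of mid-parent heterosis as commonly defined:
# H(%) = 100 * (F1_mean - mid_parent_mean) / mid_parent_mean
# All numbers are illustrative, not data from the study.

def mid_parent_heterosis(parent_a_mean: float, parent_b_mean: float, f1_mean: float) -> float:
    """Return mid-parent heterosis in percent for one trait."""
    mid_parent = (parent_a_mean + parent_b_mean) / 2.0
    return 100.0 * (f1_mean - mid_parent) / mid_parent

if __name__ == "__main__":
    # Hypothetical fruit yield per plant (kg) for two parents and their hybrid.
    print(f"Heterosis: {mid_parent_heterosis(2.0, 2.4, 3.1):.2f}%")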
Abstract:
Genotypic, phenotypic and environmental correlations were estimated for all possible pairs among eleven tomato characters. Fifteen treatments, including five parents and ten hybrids from the Instituto Agronômico (IAC) tomato breeding program, were evaluated using a randomized complete block design with three replications in Itatiba, São Paulo state, Brazil, during 2005/2006. The following traits were evaluated: fruit yield per plant (FP), fruit number per plant (FN), average fruit weight (FW), cluster number per plant (CN), fruit number per cluster (FC), number of locules per fruit (NL), fruit length (FL), fruit width (WI), fruit wall thickness (FT), total soluble solids (SS), and total titratable acidity (TA). The genotypic (rG), phenotypic (rF) and environmental (rA) correlations for each pair of traits were estimated using the Genes© program. High similarity was found between the estimates of genotypic and phenotypic correlations. Positive and high phenotypic and genotypic correlations were observed between FP and the traits FN, FW and FT, and these associations contributed to yield increase. FW and FT contributed to yield increase and should be considered together as primary yield components in tomato. Positive genotypic and phenotypic correlations revealed a strong association between FP and FN, with a high direct effect and a significant positive correlation. These traits may be included as the main selection criteria for tomato yield improvement.
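For readers unfamiliar with how such correlations are obtained, the sketch below shows one textbook way of deriving genotypic, phenotypic and environmental correlations from ANOVA/ANCOVA mean squares and mean products of a randomized complete block design. It is a hedged illustration under standard variance-component assumptions (genotype-mean basis for the phenotypic correlation); the study itself used the Genes© program, whose internal formulas may differ, and all numbers here are invented.

import math

def correlations(msg_x, mse_x, msg_y, mse_y, mpg_xy, mpe_xy, r):
    """rG, rP, rE from genotype (g) and error (e) mean squares/products, r replications."""
    var_g_x = (msg_x - mse_x) / r          # genotypic variance, trait X
    var_g_y = (msg_y - mse_y) / r          # genotypic variance, trait Y
    cov_g = (mpg_xy - mpe_xy) / r          # genotypic covariance
    var_p_x = var_g_x + mse_x / r          # phenotypic variance (genotype-mean basis)
    var_p_y = var_g_y + mse_y / r
    cov_p = cov_g + mpe_xy / r
    r_g = cov_g / math.sqrt(var_g_x * var_g_y)
    r_p = cov_p / math.sqrt(var_p_x * var_p_y)
    r_e = mpe_xy / math.sqrt(mse_x * mse_y)  # environmental correlation
    return r_g, r_p, r_e

# Hypothetical mean squares / mean products for two traits, 3 replications.
print(correlations(msg_x=4.8, mse_x=0.9, msg_y=6.1, mse_y=1.2,
                   mpg_xy=3.5, mpe_xy=0.4, r=3))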
Abstract:
Background. The surgical treatment of dysfunctional hips is a severe condition for the patient and a costly therapy for public health systems. Hip resurfacing techniques seem to hold the promise of various advantages over traditional THR, particularly for young and active patients. Although the lesson provided in the past by many branches of engineering is that success in designing competitive products can be achieved only by predicting the possible scenarios of failure, to date implant quality is poorly addressed pre-clinically. Thus, revision is the only, though delayed, reliable end point for assessment. The aim of the present work was to model the musculoskeletal system so as to develop a protocol for predicting failure of hip resurfacing prostheses. Methods. Preliminary studies validated the technique for generating subject-specific finite element (FE) models of long bones from Computed Tomography data. The proposed protocol consisted of numerical analysis of the prosthesis biomechanics through deterministic and statistical studies, so as to assess the risk of biomechanical failure under the different operative conditions the implant might face in a population of interest during various activities of daily living. Physiological conditions were defined, including the variability of the anatomy, bone densitometry, surgical uncertainties and published boundary conditions at the hip. The protocol was tested by analysing a successful design on the market and a new prototype of a resurfacing prosthesis. Results. The intrinsic accuracy of the models in predicting bone stress (RMSE < 10%) was in line with the current state of the art in this field. The accuracy of the predictions of bone-prosthesis contact mechanics was also excellent (< 0.001 mm). The sensitivity of the model predictions to uncertainties in the modelling parameters was below 8.4%. The analysis of the successful design showed very good agreement with published retrospective studies. The geometry optimisation of the new prototype led to a final design with a low risk of failure. The statistical analysis confirmed the minimal risk of the optimised design over the entire population of interest. The performance of the optimised design showed a significant improvement with respect to the first prototype (+35%). Limitations. In the authors' opinion, the major limitation of this study lies in the boundary conditions. The muscular forces and the hip joint reaction were derived from the few data available in the literature, which can be considered significant but hardly representative of the entire variability of boundary conditions the implant might face over the patient population. This moved the focus of the research to modelling the musculoskeletal system; the ongoing activity is to develop subject-specific musculoskeletal models of the lower limb from medical images. Conclusions. The developed protocol was able to accurately predict known clinical outcomes when applied to a well-established device and, when applied to a new prosthesis, to support the design optimisation phase, providing important information on critical characteristics of the patients. The presented approach has a generality that would allow the extension of the protocol to a large set of orthopaedic scenarios with minor changes. Hence, a failure mode analysis criterion can be considered a suitable tool in developing new orthopaedic devices.
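To make the statistical part of such a protocol concrete, the following is a hedged Monte Carlo sketch: sample the patient and surgical variability, evaluate a biomechanical response, and count how often it exceeds an allowable limit. The response function, parameter distributions and limit below are made-up placeholders; in the work described above, the response comes from subject-specific finite element simulations, not from a closed-form surrogate.

import random

def peak_strain(bone_density, implant_tilt_deg, body_weight):
    """Placeholder surrogate for an FE-predicted peak bone strain (microstrain)."""
    return 3000.0 * (body_weight / 75.0) * (1.0 + 0.03 * abs(implant_tilt_deg)) / bone_density

def failure_probability(n_samples=100_000, strain_limit=7000.0, seed=0):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        density = max(rng.gauss(1.0, 0.15), 0.4)   # normalized bone density (guarded)
        tilt = rng.gauss(0.0, 5.0)                 # implant placement error (deg)
        weight = rng.gauss(75.0, 12.0)             # body weight (kg)
        if peak_strain(density, tilt, weight) > strain_limit:
            failures += 1
    return failures / n_samples

print(f"Estimated risk of biomechanical failure: {failure_probability():.3%}")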
Abstract:
The relatively young discipline of astronautics represents one of the scientifically most fascinating and technologically advanced achievements of our time. Human exploration of space not only offers extraordinary research possibilities but also places high demands on man and technology. The space environment provides many attractive experimental tools for understanding fundamental mechanisms in the natural sciences. It has been shown that reduced gravity and elevated radiation in particular, two distinctive factors in space, significantly influence the behavior of biological systems. For this reason, one of the key objectives on board an Earth-orbiting laboratory is research in the field of the life sciences, covering the broad range from botany, human physiology and crew health up to biotechnology. The Columbus Module is the only European low-gravity platform that allows researchers to perform ambitious experiments over a continuous time frame of up to several months. Biolab is part of the initial outfitting of the Columbus Laboratory; it is a multi-user facility supporting research in the field of biology, e.g. the effect of microgravity and space radiation on cell cultures, micro-organisms, small plants and small invertebrates. The Biolab IEC are projects designed to work in the automatic part of Biolab. At present, in the TO-53 department of Airbus Defence & Space (formerly Astrium), two experiments are in phase C/D of development and are the subject of this thesis: CELLRAD and CYTOSKELETON. They will be launched in soft configuration, that is, packed inside a block of foam whose task is to reduce the launch loads on the payload. Until 10 years ago, payloads launched in soft configuration were assumed to be structurally safe by themselves and a specific structural analysis could be waived; with the opening of the launcher market to private companies (which are not under the direct control of the international space agencies), the requirements on payload verification have changed and have become much more conservative. In 2012, a new random environment was introduced by the new Space-X launch specification, which turns out to be particularly challenging for soft-launched payloads. The latest ESA specification requires structural analysis of the payload under combined loads (random vibration, quasi-steady acceleration and pressure). The aim of this thesis is to create FEM models able to reproduce the launch configuration, to verify that all the margins of safety are positive, and to show how they change because of the new Space-X random environment. Where the results are negative, improved design solutions are implemented. Based on the FEM results, a study of the joints has been carried out and, where needed, a crack growth analysis has been performed.
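The combined-load verification described above ultimately reduces to margin-of-safety bookkeeping. The sketch below illustrates the general idea in Python: a 3-sigma random-vibration contribution (estimated here with Miles' equation for a single-degree-of-freedom system), a quasi-static contribution and a pressure contribution are combined and compared against an allowable. The load-combination rule, safety factor and every number are illustrative assumptions, not the ESA or Space-X specification values used in the thesis.

import math

def miles_grms(f_n_hz, q_factor, psd_g2_per_hz):
    """RMS acceleration via Miles' equation for a 1-DOF system on a random base input."""
    return math.sqrt(math.pi / 2.0 * f_n_hz * q_factor * psd_g2_per_hz)

def margin_of_safety(allowable, applied, factor_of_safety):
    """MoS = allowable / (FoS * applied) - 1; positive means acceptable."""
    return allowable / (factor_of_safety * applied) - 1.0

# Hypothetical stresses (MPa) induced by each load source on a joint.
quasi_static_stress = 40.0
pressure_stress = 12.0
grms = miles_grms(f_n_hz=120.0, q_factor=10.0, psd_g2_per_hz=0.08)
random_stress = 3.0 * grms * 1.5      # 3-sigma load times a made-up stress-per-g factor

combined = quasi_static_stress + pressure_stress + random_stress
mos = margin_of_safety(allowable=250.0, applied=combined, factor_of_safety=1.25)
print(f"g_rms = {grms:.1f} g, combined stress = {combined:.1f} MPa, MoS = {mos:+.2f}")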
Abstract:
Design rights represent an interesting example of how the EU legislature has successfully regulated an otherwise heterogeneous field of law. Yet this type of protection is not for all. The tools created by EU intervention have been drafted with much more attention to the industry sector than to designers themselves. In particular, modern, digitally based, individual or small-sized, 3D-printing, open designers and their needs are largely neglected by such legislation. There is obviously nothing wrong in drafting legal tools around the needs of an industrial sector with an important role in the EU economy; on the contrary, this is a legitimate and good decision of industrial policy. However, good legislation should be fair, balanced, and (technologically) neutral in order to offer suitable solutions to all the players in the market, and all the citizens in the society, without discriminating against the smallest or the newest: the cost would be to stifle innovation. The use of printing machinery to manufacture physical objects created digitally with computer programs such as Computer-Aided Design (CAD) software has been in place for quite a few years, and it is actually the standard in many industrial fields, from aeronautics to home furniture. The change in recent years that has the potential to be a paradigm-shifting factor is the combination of the popularization of such technologies (price, size, usability, quality) and the diffusion of a culture based on access to and reuse of knowledge. We will call this blend Open Design. It is probably still too early, however, to say whether "3D printing" will be used in the future to refer to a major event in human history, or will instead be relegated to a lonely Wikipedia entry, similarly to "Betamax" (copyright scholars are familiar with it for other reasons). It is not too early, however, to develop a legal analysis that will hopefully contribute to clarifying the major issues found in the current EU design law structure, why many modern open designers will probably find better protection in copyright, and whether they can successfully rely on open licenses to achieve their goals. With regard to the latter point, we will use Creative Commons (CC) licenses to test our hypothesis, due to their unique characteristic of being modular, i.e. of having different license elements (clauses) that licensors can choose in order to adapt the license to their own needs.
Abstract:
Purpose This paper furthers the analysis of patterns regulating capitalist accumulation based on a historical anthropology of economic activities revolving around and within the Mauritian Export Processing Zone (EPZ). Design/methodology/approach This paper uses fieldwork in Mauritius to interrogate and critique two important concepts in contemporary social theory – “embeddedness” and “the informal economy.” These are viewed in the wider frame of social anthropology’s engagement with (neoliberal) capitalism. Findings A process-oriented revision of Polanyi’s work on embeddedness and the “double movement” is proposed to help us situate EPZs within ongoing power struggles found throughout the history of capitalism. This helps us to challenge the notion of economic informality as supplied by Hart and others. Social implications Scholars and policymakers have tended to see economic informality as a force from below, able to disrupt the legal-rational nature of capitalism as practiced from on high. Similarly, there is a view that a precapitalist embeddedness, a “human economy,” has many good things to offer. However, this paper shows that the practices of the state and multinational capitalism, in EPZs and elsewhere, exactly match the practices that are envisioned as the cure to the pitfalls of capitalism. Value of the paper Setting aside the formal-informal distinction in favor of a process-oriented analysis of embeddedness allows us better to understand the shifting struggles among the state, capital, and labor.
Abstract:
This thesis consists of four essays on the design and disclosure of compensation contracts. Essays 1, 2 and 3 focus on behavioral aspects of mandatory compensation disclosure rules and of contract negotiations in agency relationships. The three experimental studies develop psychology-based theory and present results that deviate from standard economic predictions. Furthermore, the results of Essays 1 and 2 also have implications for firms' discretion in how to communicate their top management's incentives to the capital market. Essay 4 analyzes the role of fairness perceptions in the evaluation of executive compensation. For this purpose, two surveys targeting representative eligible voters as well as investment professionals were conducted. Essay 1 investigates the role of the detailed 'Compensation Discussion and Analysis', which is part of the Securities and Exchange Commission's 2006 regulation, in investors' evaluations of executive performance. Compensation disclosure complying with this regulation clarifies the relationship between realized reported compensation and the underlying performance measures and their target achievement levels. The experimental findings suggest that the salient presentation of executives' incentives inherent in the 'Compensation Discussion and Analysis' makes investors' performance evaluations less outcome dependent. Therefore, investors' judgment and investment decisions might be less affected by noisy environmental factors that drive financial performance. The results also suggest that fairness perceptions of compensation contracts are essential for investors' performance evaluations, in that more transparent disclosure increases the perceived fairness of compensation and the performance evaluation of managers who are not responsible for a bad financial performance. These results have important practical implications, as firms might choose to communicate their top management's incentive compensation more transparently in order to benefit from less volatile expectations about their future performance. Similar to the first experiment, the experiment described in Essay 2 addresses the question of more transparent compensation disclosure. However, unlike the first experiment, the second experiment does not analyze the effect of a more salient presentation of contract information but the informational effect of the contract information itself. For this purpose, the experiment tests two conditions in which the assessment of the compensation contracts' incentive compatibility, which determines executive effort, is either possible or not. On the one hand, the results suggest that the quality of investors' expectations about executive effort is improved; on the other hand, investors might over-adjust their prior expectations about executive effort when confronted with an unexpected financial performance and under-adjust when the financial performance confirms their prior expectations. Therefore, in the experiment, more transparent compensation disclosure does not lead to more accurate overall judgments of executive effort and even lowers the processing quality of outcome information. These results add to the literature on disclosure, which predominantly advocates more transparency. The findings of the experiment, however, identify decreased information processing quality as a relevant disclosure cost category. Firms might therefore carefully evaluate the additional costs and benefits of more transparent compensation disclosure.
Together with the results from the experiment in Essay 1, the two experiments on compensation disclosure imply that firms should rather focus on their discretion in how to present their compensation disclosure, in order to benefit from investors' improved fairness perceptions and their spill-over to performance evaluation. Essay 3 studies the behavioral effects of contextual factors in recruitment processes that, from a standard economic perspective, do not affect the employer's or the applicant's bargaining power. In particular, the experiment studies two common characteristics of recruitment processes: pre-contractual competition among job applicants, and job applicants' non-binding effort announcements as they might be made during job interviews. Despite the standard economic irrelevance of these factors, the experiment develops theory regarding their behavioral effects on employees' subsequent effort provision and on employers' contract design choices. The experimental findings largely support the predictions. More specifically, the results suggest that firms can benefit from increased effort and, therefore, may generate higher profits. Further, firms may seize a larger share of the employment relationship's profit by highlighting the competitive aspects of the recruitment process and by requiring applicants to make announcements about their future effort. Finally, Essay 4 studies the role of fairness perceptions in the public evaluation of executive compensation. Although economic criteria for the design of incentive compensation generally do not make restrictive recommendations with regard to the amount of compensation, fairness perceptions might be relevant from the perspective of firms and standard setters. This is because behavioral theory has identified fairness as an important determinant of individuals' judgment and decisions. However, although fairness concerns about executive compensation are often voiced in the popular media and even in the literature, evidence on the meaning of fairness in the context of executive compensation is scarce and ambiguous. In order to inform practitioners and standard setters whether fairness concerns are exclusive to non-professionals or relevant for investment professionals as well, the two surveys presented in Essay 4 aim to find commonalities in the opinions of representative eligible voters and investment professionals. The results suggest that fairness is an important criterion for both groups. In particular, exposure to risk in the form of the variable compensation share is an important criterion shared by both groups: the higher the assumed variable share, the higher the compensation amount perceived as fair. However, to a large extent, opinions on executive compensation depend on personality characteristics, and to some extent, investment professionals' perceptions deviate systematically from those of non-professionals. The findings imply that firms might benefit from emphasizing the riskiness of their managers' variable pay components and, therefore, are also in line with those of Essay 1.
Abstract:
We use a novel dataset and research design to empirically detect the effect of social interactions among neighbors on labor market outcomes. Specifically, using Census data that characterize residential and employment locations down to the city block, we examine whether individuals residing in the same block are more likely to work together than individuals in nearby but not identical blocks. We find significant evidence of social interactions operating at the block level: residing on the same versus nearby blocks increases the probability of working together by over 33 percent. The results also indicate that this referral effect is stronger when individuals are similar in sociodemographic characteristics (e.g., both have children of similar ages) and when at least one individual is well attached to the labor market. These findings are robust across various specifications intended to address concerns related to sorting and reverse causation. Further, having determined the characteristics of a pair of individuals that lead to an especially strong referral effect, we provide evidence that the increased availability of neighborhood referrals has a significant impact on a wide range of labor market outcomes including employment and wages.
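The core of the research design is a comparison of pair-level rates: among pairs of neighbors, how often do same-block pairs work together relative to nearby-block pairs? The Python sketch below illustrates only that comparison with a tiny invented sample; the paper itself uses confidential Census block-level data, far larger samples and specifications that control for sorting and reverse causation.

# Hedged illustration of the same-block vs nearby-block comparison.
# Each tuple is (same_block, work_together) for one pair of neighbors; toy data only.
pairs = [
    (True, True), (True, False), (True, False), (True, True),
    (False, False), (False, False), (False, True), (False, False),
    (False, False), (False, False), (False, True), (False, False),
]

def rate(same_block_flag):
    subset = [works for same, works in pairs if same == same_block_flag]
    return sum(subset) / len(subset)

p_same, p_nearby = rate(True), rate(False)
print(f"P(work together | same block)   = {p_same:.3f}")
print(f"P(work together | nearby block) = {p_nearby:.3f}")
print(f"Relative increase: {100 * (p_same - p_nearby) / p_nearby:.1f}%")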
Abstract:
Hospital districts (HDs) that serve the uninsured and the needy face new challenges with the implementation of Medicaid managed care. The potential loss of Medicaid patients and revenues may affect the ability to cost-shift and subsequently decrease the ability of the HD to meet its legal obligation of providing care for the uninsured. To investigate HD viability in the current market, the aims of this study were to: (1) describe the HD's environment, (2) document the HD's strategic response, (3) document changes in the HD's performance (patient volume) and financial status, and (4) determine whether relationships or trends exist between HD strategy, performance and financial status. To achieve these aims, three Texas HDs (Fort Worth, Lubbock, and San Antonio) were selected for evaluation. For each HD, four types of strategic responses were documented and evaluated for change. In addition, the ability of each HD to sustain operations was evaluated by documenting changes in performance and financial status (patient volume and financial ratios). A pre-post case study design was used in which the Medicaid managed care "rollout" date at each site was the central date. First, a descriptive analysis was performed that documented the environment, strategy, financial status, and patient volume of each hospital district. Second, to compare hospital districts, each hospital district was: (i) classified by a risk index, (ii) classified by its strategic response profile, and (iii) given a performance score based upon pre-post changes in patient volume and financial indicators. Results indicated that all three HDs operate in a high-risk environment compared to the rest of the nation. Two HDs chose the "Status Quo" response, whereas one HD chose the "Competitive Proactive" response. Medicaid patient volume decreased in two of the three HDs, whereas indigent patient volume increased in two of the three (an indication of increasing financial risk). Total patient revenues for all HDs increased over the study period; however, the rate of increase slowed for all three after the Medicaid rollout date. All HDs experienced a decline in financial status between the pre and post periods, with the greatest decline observed in the HD that saw the greatest increase in indigent patient volume. The pre-post case study format used and the lack of control study sites do not allow for assignment of causality. However, the results suggest possible adverse effects of Medicaid managed care and the need for a larger study based on a stronger evaluation research design.
Abstract:
Improving energy efficiency is unarguably an emerging issue in developing economies, and an energy efficiency standard and labeling program is an ideal mechanism to achieve this target. However, there is concern over whether consumers will choose highly energy-efficient appliances, given their high prices resulting from high costs. This paper estimates how consumers respond to the introduction of the energy efficiency standard and labeling program in China. To quantify consumers' evaluations, we estimated their consumer surplus and the benefits of the products based on the estimated parameters of the demand function. We found the following. First, consumers' evaluation of the energy efficiency labels is not monotonically correlated with the label grades. The highest-efficiency label (Label 1) is not valued higher than Labels 2 and 3, and is sometimes valued lower than the least energy-efficient label (Label UI). This goes against the design of the policy intervention. Second, several governmental policies act in mixed directions: the subsidies for energy-saving appliances carrying the highest label grade contribute to expanding consumer welfare, as the program was designed to do. However, the policy promoting replacement with new appliances decreased welfare.
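To show how consumer surplus can be read off estimated demand parameters, the sketch below uses the standard logit-demand expression, CS = ln(Σ_j exp(V_j)) / α up to a constant, where α is the absolute price coefficient (marginal utility of income). This is a textbook formula offered as a hedged illustration, not necessarily the exact specification estimated in the paper, and the utilities, α and the subsidy effect are invented.

import math

def logit_consumer_surplus(utilities, alpha):
    """Expected consumer surplus (per consumer, up to a constant) for a logit choice set."""
    return math.log(sum(math.exp(v) for v in utilities)) / alpha

# Hypothetical mean utilities of appliances carrying Labels 1, 2, 3 and UI.
base = [1.2, 1.0, 0.8, 0.5]
with_subsidy = [1.5, 1.0, 0.8, 0.5]      # subsidy raises the utility of Label 1 products
alpha = 0.9

print(f"CS baseline:     {logit_consumer_surplus(base, alpha):.3f}")
print(f"CS with subsidy: {logit_consumer_surplus(with_subsidy, alpha):.3f}")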
Abstract:
Lately the short-wave infrared (SWIR) band has become very important due to the recent appearance on the market of small detectors with large focal plane arrays. Military applications for SWIR cameras include handheld and airborne systems with long-range detection requirements, where volume and weight restrictions must also be considered. In this paper we present three telephoto objectives that have been designed according to three different methods. The first is the conventional method, where the starting point is an existing design. The second starts from the design of an aplanatic system. The third is the simultaneous multiple surfaces (SMS) method, where the starting point is the input wavefronts that we choose. The designs are compared in terms of optical performance, volume, weight and manufacturability. Because the objectives have been designed for the SWIR waveband, color correction has important implications for the choice of glasses, which are discussed in detail.
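For context on what "telephoto" means at first order, the sketch below computes the paraxial (thin-lens) layout of a positive front group and a negative rear group: the total track length comes out shorter than the effective focal length, i.e. the telephoto ratio is below one. This is textbook paraxial optics offered as an illustration only, not any of the three SWIR designs above; the focal lengths and separation are invented.

def telephoto_first_order(f1, f2, d):
    """Return (effective focal length, back focal distance, telephoto ratio) for two thin lenses."""
    efl = f1 * f2 / (f1 + f2 - d)            # combined focal length
    bfd = f2 * (f1 - d) / (f1 + f2 - d)      # back focal distance measured from the rear lens
    ratio = (d + bfd) / efl                  # total length divided by EFL
    return efl, bfd, ratio

efl, bfd, ratio = telephoto_first_order(f1=100.0, f2=-40.0, d=70.0)  # millimetres
print(f"EFL = {efl:.1f} mm, BFD = {bfd:.1f} mm, telephoto ratio = {ratio:.2f}")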
Abstract:
The simulation of interest rate derivatives is a powerful tool for facing current market fluctuations. However, the complexity of the financial models and the way they are processed require exorbitant computation times, which is in clear conflict with the need for processing times as short as possible in order to operate in the financial market. To shorten the computation time of financial derivatives, the use of hardware accelerators becomes a must.
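To give a feel for why such simulations are computationally heavy, the sketch below runs a plain Monte Carlo / Euler discretization of a one-factor short-rate model (Vasicek, chosen here as an assumed example) to price a zero-coupon bond. Production pricing engines use many more paths, finer time steps and richer models, which is exactly what motivates hardware acceleration; the parameters below are invented.

import math, random

def vasicek_zcb_price(r0, a, b, sigma, maturity, n_steps=250, n_paths=10_000, seed=42):
    """Monte Carlo estimate of P(0,T) = E[exp(-integral of r dt)] under dr = a(b-r)dt + sigma dW."""
    rng = random.Random(seed)
    dt = maturity / n_steps
    sqrt_dt = math.sqrt(dt)
    total = 0.0
    for _ in range(n_paths):
        r, integral = r0, 0.0
        for _ in range(n_steps):
            integral += r * dt
            r += a * (b - r) * dt + sigma * sqrt_dt * rng.gauss(0.0, 1.0)
        total += math.exp(-integral)
    return total / n_paths

print(f"Zero-coupon bond price: {vasicek_zcb_price(r0=0.02, a=0.5, b=0.03, sigma=0.01, maturity=5.0):.4f}")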
Abstract:
In recent years, the Wireless Visual Sensor Network (WVSN) has drawn great interest in the wireless communication research area. WVSNs enable a wealth of new applications such as building security control, image sensing, and target localization. However, current wireless communication protocols (ZigBee, Wi-Fi, and Bluetooth, for example) cannot fully satisfy the demands for high data rate, low power consumption, short range, and high robustness. A new communication protocol is highly desirable for this kind of application.
The Ultra Wideband (UWB) wireless communication protocol, which has gained importance in the high-data-rate wireless communication field, is emerging as an important topic for WVSN research. UWB has emerged as a technology that offers great promise to satisfy the growing demand for low-cost, high-speed digital wireless indoor and home networks. The large available bandwidth, the potential for high data rate transmission, and the potential for low complexity and low power consumption, along with low implementation cost, all present a unique opportunity for UWB to become a widely adopted radio solution for future Wireless Personal Area Network (WPAN) applications. UWB is defined as any transmission that occupies a bandwidth of more than 20% of its center frequency, or more than 500 MHz. In 2002, the Federal Communications Commission (FCC) mandated that UWB radio transmission can legally operate in the range from 3.1 to 10.6 GHz at a transmitter power of -41.3 dBm/Hz. Under the FCC guidelines, the use of UWB technology can provide enormous capacity over short communication ranges. Considering Shannon's capacity equation, increasing the channel capacity requires a linear increase in bandwidth, whereas a similar increase in channel capacity would require an exponential increase in transmission power. In recent years, several different UWB developments have been widely studied in different areas, among which the MB-OFDM UWB wireless communication protocol is considered to be the leading choice and has recently been adopted in the ISO/IEC standard for WPANs. By combining OFDM modulation and data transmission using frequency-hopping techniques, the MB-OFDM UWB system is able to support various data rates, ranging from 55 to 480 Mbps, over distances of up to 10 meters. The MB-OFDM technology is expected to consume very little power and silicon area, as well as provide low-cost solutions that can satisfy consumer market demands. To fulfill these expectations, MB-OFDM UWB research and development have to cope with several challenges, which consist of high-sensitivity synchronization, low-complexity constraints, strict power limitations, scalability, and flexibility. Such challenges require state-of-the-art digital signal processing expertise to develop systems that can fully take advantage of the UWB spectrum and support future indoor wireless applications. This thesis focuses on the full optimization of an MB-OFDM UWB digital baseband transceiver system, aiming at researching and designing a wireless communication subsystem for the Wireless Visual Sensor Network (WVSN) application. The inherently high complexity of the FFT/IFFT processor and the synchronization system, together with the high operating frequency of all processing elements, becomes the bottleneck for low-power MB-OFDM-based UWB digital baseband hardware design and implementation. The proposed transceiver system targets low power and low complexity under the premise of high performance. Optimizations are made at both the algorithm and the architecture level for each element of the transceiver system. Low-power, hardware-efficient structures are first proposed for the core computation modules: a mixed-radix, pipelined architecture is proposed for the Fast Fourier Transform (FFT/IFFT) processor, and a cost-speed balanced Viterbi Decoder (VD) module is developed, with the aim of lowering the power consumption and increasing the processing speed.
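As a point of reference for the FFT/IFFT processing discussed above, the sketch below implements a plain recursive radix-2 decimation-in-time FFT in Python and checks it against the numpy reference. This is only the kind of floating-point golden model one might compare a hardware implementation against; it is not the thesis's 128-point mixed-radix, multipath pipelined architecture.

import cmath
import numpy as np

def fft_radix2(x):
    """Recursive radix-2 decimation-in-time FFT (len(x) must be a power of 2)."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft_radix2(x[0::2])
    odd = fft_radix2(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal(128) + 1j * rng.standard_normal(128)   # one 128-sample symbol
max_err = np.max(np.abs(np.array(fft_radix2(list(x))) - np.fft.fft(x)))
print(f"Max deviation from numpy reference: {max_err:.2e}")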
In addition, a low-complexity, sign-bit-correlation-based symbol timing synchronization scheme is presented so as to detect and synchronize the OFDM packets robustly and accurately. Moreover, several state-of-the-art technologies are used for developing the other processing subsystems, and an entire MB-OFDM digital baseband transceiver system is integrated. The target device for the proposed transceiver system is a Xilinx Virtex 5 XC5VLX110T FPGA board. In order to validate the proposed transceiver system on the FPGA board, a unified algorithm-architecture-circuit hardware/software co-design environment for complex FPGA system development is presented in this work. The main objective of the proposed strategy is to find an efficient methodology for designing a configurable, optimized FPGA system with as little effort as possible spent on the system verification procedure, so as to shorten the system development period. The presented co-design methodology has the advantages of being easy to use, covering all steps from algorithm proposal to hardware verification, and being applicable to almost all kinds of FPGA developments. Because only the digital baseband transceiver system is developed in this thesis, the validation of transmitting signals through a wireless channel in real communication environments still requires the analog front-end and RF components. However, by using the aforementioned hardware/software co-simulation methodology, the transmitter and receiver digital baseband systems can communicate with each other through the channel models proposed by the IEEE 802.15.3a research group and implemented in MATLAB. Thus, by simply adjusting the characteristics of each channel model, e.g. mean excess delay and center frequency, we can estimate the transmission performance of the proposed transceiver system in different communication situations. The main contributions of this thesis are:
• A novel mixed-radix 128-point FFT algorithm using a multipath pipelined architecture is proposed. The complex multipliers for each processing stage are designed using modified shift-add architectures. The system word length and twiddle factor word length are compared and selected based on Signal to Quantization Noise Ratio (SQNR) and power analysis.
• The IFFT processor performance is analyzed under different Block Floating Point (BFP) arithmetic schemes for overflow control, so as to find the best IFFT architecture based on the proposed FFT processor.
• An innovative, low-complexity timing synchronization and compensation scheme, consisting of Packet Detector (PD) and Timing Offset Estimation (TOE) functions, is employed for the MB-OFDM UWB receiver system. By simplifying the cross-correlation and maximum likelihood functions to sign-bit only, the computational complexity is significantly reduced.
• A 64-state soft-decision Viterbi Decoder using a high-speed radix-4 Add-Compare-Select architecture is proposed. The Two-pointer Even algorithm is also introduced into the Trace Back unit for the sake of hardware efficiency.
• Several state-of-the-art technologies are integrated into the complete baseband transceiver system, with the aim of implementing a highly optimized UWB communication system.
• An improved design flow is proposed for complex system implementation, which can be used for general Field-Programmable Gate Array (FPGA) designs. The design method not only dramatically reduces the time needed for functional verification, but also provides automatic analyses, such as of errors and output delays, for the implemented hardware systems.
• A virtual communication environment is established for validating the proposed MB-OFDM transceiver system. This methodology proves easy to use and convenient for analyzing the digital baseband system, without an analog front-end, under different communication environments.
This PhD thesis is organized in six chapters. Chapter 1 gives a brief introduction to the UWB field and the related work, along with the motivation for the MB-OFDM system development. Chapter 2 presents the general information and requirements of the MB-OFDM UWB wireless communication protocol. Chapter 3 presents the architecture of the MB-OFDM digital baseband transceiver system; the design of the proposed algorithm and architecture for each processing element is detailed in this chapter. The design challenges of such a system involve trade-offs among design complexity, power consumption, hardware cost, system performance, and other aspects; all these factors are analyzed and discussed. Chapter 4 proposes the hardware/software co-design methodology; each step of this design flow is detailed with examples encountered during system development. Then, taking advantage of this design strategy, the virtual communication procedure is carried out so as to test and analyze the proposed transceiver architecture. Experimental results from the co-simulation and the synthesis report of the implemented FPGA system are given in Chapter 5. Chapter 6 includes conclusions and future work, as well as the results derived from this PhD work.
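To make the sign-bit correlation principle behind the packet detection and timing synchronization scheme described above concrete, the following minimal sketch reduces both the stored preamble and the received samples to their sign bits before correlating, so the "correlation" needs only additions and subtractions rather than multipliers. It is a simplified, real-valued illustration with an invented preamble, noise level and signal model, not the thesis's MB-OFDM synchronizer.

import numpy as np

rng = np.random.default_rng(1)
preamble = rng.choice([-1.0, 1.0], size=128)          # known training sequence
noise_only = rng.standard_normal(200)                 # samples before the packet arrives
packet = np.concatenate([noise_only, 0.7 * preamble + 0.3 * rng.standard_normal(128)])

signs = np.sign(packet)                               # received samples reduced to sign bits
ref = np.sign(preamble)                               # stored reference reduced to sign bits
corr = np.array([np.sum(signs[i:i + 128] * ref) for i in range(len(packet) - 128 + 1)])

peak = int(np.argmax(np.abs(corr)))
print(f"Detected packet start at sample {peak} "
      f"(true start 200), normalized peak {abs(corr[peak]) / 128:.2f}")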