985 results for Established system


Relevance:

30.00%

Publisher:

Abstract:

A patient classification system was developed that integrates a patient acuity instrument with a computerized nursing distribution method based on a linear programming model. The system was designed for real-time measurement of patient acuity (workload) and for allocation of nursing personnel to optimize the utilization of resources. The acuity instrument was a prototype tool with eight categories of patients defined by patient severity and nursing intensity parameters. From this tool, the demand for nursing care was expressed in patient points, with one point equal to one hour of RN time. Validity and reliability of the instrument were determined as follows: (1) content validity by a panel of expert nurses; (2) predictive validity through a paired t-test analysis of preshift and postshift categorization of patients; (3) initial reliability by a one-month pilot of the instrument in a practice setting; and (4) interrater reliability by the Kappa statistic. The nursing distribution system was a linear programming model using a branch-and-bound technique for obtaining integer solutions. The objective function was to minimize the total number of nursing personnel used by optimally assigning staff to meet the acuity needs of the units. A penalty weight was used as a coefficient of the objective-function variables to define priorities for allocation of staff. The demand constraints required meeting the total acuity points needed for each unit and having a minimum number of RNs on each unit. The supply constraints were: (1) the total availability of each type of staff and the value of that staff member, where value was determined relative to that staff type's ability to perform the job function of an RN (e.g., value for eight hours: RN = 8 points, LVN = 6 points); and (2) the number of personnel available for floating between units. The capability of the model to assign staff quantitatively and qualitatively equal to the manual method was established by a thirty-day comparison.
Sensitivity testing demonstrated appropriate adjustment of the optimal solution to changes in penalty coefficients in the objective function and to acuity totals in the demand constraints. Further investigation of the model documented correct adjustment of assignments in response to staff value changes, and cost minimization through the addition of a dollar coefficient to the objective function.
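The staffing model above can be sketched in miniature. The following is a hypothetical, simplified stand-in for the dissertation's branch-and-bound linear program: a brute-force integer search over one unit's RN/LVN mix, using the point values stated in the abstract (RN = 8, LVN = 6 per eight-hour shift) and illustrative penalty weights of my own choosing.

```python
from itertools import product

# Point values per eight-hour shift, as given in the abstract.
VALUE = {"RN": 8, "LVN": 6}
# Penalty weights defining allocation priority -- illustrative numbers only.
PENALTY = {"RN": 2, "LVN": 1}

def best_assignment(demand_points, min_rn, max_staff=6):
    """Brute-force integer search standing in for the branch-and-bound LP:
    minimize penalty-weighted staff subject to the unit's acuity demand
    (in patient points) and a minimum RN count."""
    best, best_cost = None, float("inf")
    for rn, lvn in product(range(max_staff + 1), repeat=2):
        if rn < min_rn:
            continue  # demand constraint: minimum RNs on the unit
        if rn * VALUE["RN"] + lvn * VALUE["LVN"] < demand_points:
            continue  # demand constraint: total acuity points
        cost = rn * PENALTY["RN"] + lvn * PENALTY["LVN"]
        if cost < best_cost:
            best, best_cost = (rn, lvn), cost
    return best

# A unit needing 30 patient points with at least 2 RNs:
print(best_assignment(30, 2))  # -> (2, 3): 2 RNs + 3 LVNs supply 34 points
```

The full model additionally handles several units at once and float-pool constraints; this sketch only shows the shape of the integer optimization.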


Beginning in the early 1980s, the health care system experienced momentous realignments. Fundamental changes in the structures of traditional health care organizations, shifts in the authority and relationships of professionals and institutions, and the increasing influence of managed care turned a relatively stable industry into a turbulent one. The dynamics of these changes are recurring themes in the health services literature. The purpose of this dissertation was to examine the content of this literature over a defined time period and within the perspective of a theory of organizational change. Using a theoretical framework based upon the organizational theory known as Organizational Ecology, secondary data from the period between 1983 and 1994 were reviewed. Analysis of the literature, identified through a defined search methodology, focused on determining how the literature characterized the changes it described. Using a model constructed from fundamentals of Organizational Ecology to structure the assessment of content, the literature was summarized for the manner and extent of change in specific organizational forms and for the shifts in emphasis among the environmental dynamics directing changes in the population of organizations. Although it was not the intent of the analysis to substantiate causal relationships between the environmental resources selected as the determinants of organizational change and the observed changes in organizational forms, the structured review of the literature established a strong basis for inferring such a relationship. The results of the integrative review and the power of the appraisal achieved through the theoretical framework indicate that there is considerable value in such an approach.
A historical perspective on the changes that have transformed the health care system, developed within a defined organizational theory, provides a unique insight into these changes and indicates the need for further development of such an analytical model.


AIMS Information on tumour border configuration (TBC) in colorectal cancer (CRC) is currently not included in most pathology reports, owing to lack of reproducibility and/or established evaluation systems. The aim of this study was to investigate whether an alternative scoring system based on the percentage of the infiltrating component may represent a reliable method for assessing TBC. METHODS AND RESULTS Two hundred and fifteen CRCs with complete clinicopathological data were evaluated by two independent observers, both 'traditionally' by assigning the tumours into pushing/infiltrating/mixed categories, and alternatively by scoring the percentage of infiltrating margin. With the pushing/infiltrating/mixed pattern method, interobserver agreement (IOA) was moderate (κ = 0.58), whereas with the percentage of infiltrating margins method, IOA was excellent (intraclass correlation coefficient of 0.86). A higher percentage of infiltrating margin correlated with adverse features such as higher grade (P = 0.0025), higher pT (P = 0.0007), pN (P = 0.0001) and pM classification (P = 0.0063), high-grade tumour budding (P < 0.0001), lymphatic invasion (P < 0.0001), vascular invasion (P = 0.0032), and shorter survival (P = 0.0008), and was significantly associated with an increased probability of lymph node metastasis (P < 0.001). CONCLUSIONS Information on TBC gives additional prognostic value to pathology reports on CRC. The novel proposed scoring system, by using the percentage of infiltrating margin, outperforms the 'traditional' way of reporting TBC. Additionally, it is reproducible and simple to apply, and can therefore be easily integrated into daily diagnostic practice.
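The interobserver statistics reported here are standard measures; for reference, Cohen's kappa for two raters can be computed as below. The category labels in the example are placeholders, not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Expected agreement if both raters labelled independently at their
    # observed marginal rates.
    expected = sum(ca[k] * cb[k] for k in ca.keys() | cb.keys()) / n ** 2
    return (observed - expected) / (1 - expected)

labels_a = ["pushing", "infiltrating", "mixed", "infiltrating"]
labels_b = ["pushing", "infiltrating", "infiltrating", "infiltrating"]
print(round(cohens_kappa(labels_a, labels_b), 3))  # -> 0.556
```

Values around 0.4-0.6, as for the pushing/infiltrating/mixed method here, are conventionally read as moderate agreement.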


The paper revives a theoretical definition of party coherence as composed of two basic elements, cohesion and factionalism, and proposes and applies a novel empirical measure based on spin physics. The simultaneous analysis of both components using a single measurement concept is applied to data representing the political beliefs of candidates in the Swiss general elections of 2003 and 2007, suggesting a connection between the coherence of the beliefs party members hold and the assessment of parties as being at risk of splitting. We also compare our measure with established polarization measures and demonstrate its advantage for multi-dimensional data that lack clear structure. Furthermore, we outline how our analysis supports the distinction between bottom-up and top-down mechanisms of party splitting. In this way, we are able to turn the intuition of coherence into a defined quantitative concept that also offers a methodological basis for comparative research on party coherence. Our work serves as an example of how a complex systems approach can provide a new perspective on a long-standing issue in political science.
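As a rough illustration of the spin-physics analogy (not the paper's actual estimator), positions on a binary issue can be coded as spins of +1/-1 and coherence measured as the mean pairwise alignment, which is 1 for a fully cohesive party and drops toward negative values for a factionalized one.

```python
def pairwise_alignment(spins):
    """Mean product over all pairs of +/-1 positions: 1.0 means full
    cohesion; values near -1 indicate two opposing factions."""
    n = len(spins)
    pairs = [spins[i] * spins[j] for i in range(n) for j in range(i + 1, n)]
    return sum(pairs) / len(pairs)

print(pairwise_alignment([1, 1, 1, 1]))    # cohesive party -> 1.0
print(pairwise_alignment([1, 1, -1, -1]))  # two equal factions
```

The paper's measure handles many issues simultaneously and distinguishes cohesion from factionalism; this sketch only conveys the basic spin intuition.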


OBJECTIVE Intraarticular gadolinium-enhanced magnetic resonance arthrography (MRA) is commonly applied to characterize morphological disorders of the hip. However, the reproducibility of retrieving anatomic landmarks on MRA scans and their correlation with intraarticular pathologies is unknown. A precise mapping system for the exact localization of hip pathomorphologies with radial MRA sequences is lacking. Therefore, the purpose of the study was the establishment and validation of a reproducible mapping system for radial sequences of hip MRA. MATERIALS AND METHODS Sixty-nine consecutive intraarticular gadolinium-enhanced hip MRAs were evaluated. Radial sequencing consisted of 14 cuts orientated along the axis of the femoral neck. Three orthopedic surgeons read the radial sequences independently. Each MRI was read twice with a minimum interval of 7 days from the first reading. The intra- and inter-observer reliability of the mapping procedure was determined. RESULTS A clockwise system for hip MRA was established. The teardrop figure served to determine the 6 o'clock position of the acetabulum; the center of the greater trochanter served to determine the 12 o'clock position of the femoral head-neck junction. The intra- and inter-observer ICCs to retrieve the correct 6/12 o'clock positions were 0.906-0.996 and 0.978-0.988, respectively. CONCLUSIONS The established mapping system for radial sequences of hip joint MRA is reproducible and easy to perform.
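The clockwise convention reduces to a small calculation. The sketch below is a hypothetical illustration, assuming angles are measured clockwise from the established 6 o'clock reference landmark:

```python
def clock_position(angle_deg):
    """Map an angle in degrees, measured clockwise from the 6 o'clock
    reference, to the nearest clock hour (1-12); each hour spans 30 deg."""
    hour = (6 + round(angle_deg / 30)) % 12
    return 12 if hour == 0 else hour

print(clock_position(0))    # the reference itself -> 6
print(clock_position(180))  # opposite the reference -> 12
```

Radial MRA cuts could then be labelled by the clock position at which each cut intersects the acetabular rim or head-neck junction.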


Each year about 650,000 Europeans die from stroke, and a similar number live with the sequelae of multiple sclerosis (MS). Stroke and MS differ in their etiology. Although cause and clinical presentation set the two diseases apart, they share common downstream mechanisms that lead to damage and recovery. Demyelination and axonal injury are characteristics of MS but are also observed in stroke. Conversely, hallmarks of stroke, such as vascular impairment and neurodegeneration, are found in MS. However, the most conspicuous common feature is the marked neuroinflammatory response, characterized by glial cell activation and immune cell influx. In MS and stroke the blood-brain barrier is disrupted, allowing bone marrow-derived macrophages to invade the brain in support of the resident microglia. In addition, there is a massive invasion of auto-reactive T-cells into the brain of patients with MS. Though less pronounced, a similar phenomenon is also found in ischemic lesions. Not surprisingly, the two diseases also resemble each other at the level of gene expression and the biosynthesis of other proinflammatory mediators. While MS has traditionally been considered an autoimmune neuroinflammatory disorder, the role of inflammation in cerebral ischemia was recognized only later. In the case of MS, the long track record as a neuroinflammatory disease has paid off with respect to treatment options: there are now about a dozen approved drugs for the treatment of MS that specifically target neuroinflammation by modulating the immune system. Interestingly, experimental work demonstrated that drugs in routine use to mitigate neuroinflammation in MS may also work in stroke models. Examples include fingolimod, glatiramer acetate, and antibodies blocking the leukocyte integrin VLA-4.
Moreover, therapeutic strategies that were discovered in experimental autoimmune encephalomyelitis (EAE), the animal model of MS, also turned out to be effective in experimental stroke models. This suggests that previous achievements in MS research may be relevant for stroke. Interestingly, the converse is equally true. Concepts of the neurovascular unit that were developed in a stroke context turned out to be applicable to neuroinflammatory research in MS. Examples include work on the important role of the vascular basement membrane and the BBB in the invasion of immune cells into the brain. Furthermore, tissue plasminogen activator (tPA), the only established drug treatment in acute stroke, modulates the pathogenesis of MS. Endogenous tPA is released from endothelium and astroglia and acts on the BBB, microglia, and other neuroinflammatory cells. Thus, the vascular perspective of stroke research provides important input into the mechanisms by which endothelial cells and the BBB regulate inflammation in MS, particularly the invasion of immune cells into the CNS. In the current review we first discuss the pathogenesis of both diseases and current treatment regimens, and then provide a detailed overview of the pathways of immune cell migration across the barriers of the CNS and the role of activated astrocytes in this process. This article is part of a Special Issue entitled: Neuro inflammation: A common denominator for stroke, multiple sclerosis and Alzheimer's disease, guest edited by Helga de Vries and Markus Swaninger.


Clinical observations made by practitioners and reported using web- and mobile-based technologies may benefit disease surveillance by improving the timeliness of outbreak detection. Equinella is a voluntary electronic reporting and information system established for the early detection of infectious equine diseases in Switzerland. Sentinel veterinary practitioners have been able to report cases of non-notifiable diseases and clinical symptoms to an internet-based platform since November 2013. Telephone interviews were carried out during the first year to understand the motivating and constraining factors affecting voluntary reporting and the use of mobile devices in a sentinel network. We found that non-monetary incentives attract sentinel practitioners; however, insufficient understanding of the reporting system and of its relevance, as well as concerns over the electronic dissemination of health data were identified as potential challenges to sustainable reporting. Many practitioners are not yet aware of the advantages of mobile-based surveillance and may require some time to become accustomed to novel reporting methods. Finally, our study highlights the need for continued information feedback loops within voluntary sentinel networks.


Vertical integration is grounded in economic theory as a corporate strategy for reducing cost and enhancing efficiency. This dissertation had three purposes. The first was to describe and understand vertical integration theory. The review of the economic theory established vertical integration as a corporate cost-reduction strategy responding to environmental, structural, and performance dimensions of the market. The second purpose was to examine vertical integration in the context of the health care industry, which has greater complexity, higher instability, and more unstable demand than other industries, although many of the same dimensions of the market supported a vertical integration strategy. Evidence on the performance of health systems after integration revealed mixed results. Because the market continues to be turbulent, hybrid, non-owned integration in the form of alliances has increased to over 40% of urban hospitals. The third purpose of the study was to examine the application of vertical integration in health care and evaluate its effects. The case studied was an alliance formed between a community hospital and a tertiary medical center to facilitate vertical integration of oncology services while maintaining effectiveness and preserving access. The economic benefits for 1934 patients were evaluated in the delivery system before and after integration, with a more detailed economic analysis of breast, lung, colon/rectal, and non-malignant cases. A regression analysis confirmed the relationship between the independent variables (age, sex, location of services, race, stage of disease, and diagnosis) and the dependent variable, cost. The results of the basic regression model, as well as the regression with first-order interaction terms, were statistically significant. The study shows that vertical integration at an intermediate health care system level has economic benefits.
If the pre-integration oncology group had been treated in the post-integration model, the expected cost savings from integration would have been 31.5%. The quality indicators used were access to health care services and to research treatment protocols, and access was preserved in the integrated model. Using survival as a direct quality outcome measure, the survival of lung cancer patients was statistically the same before and after integration.


Salamanca is cataloged as one of the most polluted cities in Mexico. In order to observe the behavior and clarify the influence of wind parameters on sulphur dioxide (SO2) concentrations, a Self-Organizing Map (SOM) neural network has been implemented at three monitoring locations for the period from January 1 to December 31, 2006. The maximum and minimum daily values of SO2 concentrations measured during 2006 were correlated with the wind parameters of the same period. The main advantage of the SOM neural network is that it allows data from different sensors to be integrated and provides readily interpretable results. In particular, it is a powerful mapping and classification tool, which orders information in an accessible way and facilitates the task of establishing an order of priority among the distinguished groups of concentrations, depending on their need for further research or remediation actions in subsequent management steps. For each monitoring location, SOM classifications were evaluated with respect to the pollution levels established by health authorities. The classification system can help to establish a better air quality monitoring methodology, which is essential for assessing the effectiveness of imposed pollution controls and strategies and for facilitating the reduction of pollutants.
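A minimal SOM of the kind used here can be sketched as follows; the grid size, learning rate, and feature layout are illustrative choices, not the study's configuration.

```python
import math
import random

def best_matching_unit(nodes, x):
    """Node whose weight vector is closest to sample x."""
    return min(nodes, key=lambda k: sum((w - v) ** 2 for w, v in zip(nodes[k], x)))

def train_som(data, grid=(4, 4), epochs=50, lr=0.5, seed=1):
    """Train a small 2-D SOM; each node holds a weight vector of the same
    dimension as the input samples (e.g. [SO2 level, wind speed, ...])."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = {(i, j): [rng.random() for _ in range(dim)]
             for i in range(grid[0]) for j in range(grid[1])}
    for epoch in range(epochs):
        radius = max(grid) / 2 * (1 - epoch / epochs) + 0.5  # shrinks over time
        for x in data:
            bmu = best_matching_unit(nodes, x)
            for key, w in nodes.items():
                d = math.dist(key, bmu)
                if d <= radius:  # neighbours move toward the sample
                    h = math.exp(-d * d / (2 * radius * radius))
                    for t in range(dim):
                        w[t] += lr * h * (x[t] - w[t])
    return nodes

# Two artificial clusters end up on different map nodes:
som = train_som([[0.0, 0.0], [0.05, 0.0], [1.0, 1.0], [1.0, 0.95]])
print(best_matching_unit(som, [0.0, 0.0]), best_matching_unit(som, [1.0, 1.0]))
```

After training, each node can be labelled by the pollution level of the samples mapped to it, which is how the SOM groups become a priority ordering.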


Massive integration of renewable energy sources in the electrical power systems of remote islands is a subject of current interest. The increasing cost of fossil fuels, transport costs to isolated sites, and environmental concerns constitute a serious drawback to the use of conventional fossil fuel plants. In a weak electrical grid, as is typical on an island, if a large amount of conventional generation is replaced by renewable energy sources, power system safety and stability can be compromised in the case of large grid disturbances. In this work, a model for transient stability analysis of an isolated electrical grid fed exclusively from a combination of renewable energy sources has been studied. This new generation model will be installed on El Hierro Island, Spain. Additionally, an operation strategy to coordinate the generation units (wind, hydro) is established. Attention is given to the assessment of inertial energy and reactive current to guarantee power system stability against large disturbances. The effectiveness of the proposed strategy is shown by means of simulation results.
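The role of inertial energy can be illustrated with the swing equation in its simplest undamped per-unit form, df/dt = ΔP·f0/(2H): the larger the inertia constant H, the slower frequency decays after a loss of generation. The numbers below are illustrative, not the El Hierro system's parameters.

```python
def frequency_after(p_imbalance_pu, inertia_h, t=1.0, dt=0.001, f0=50.0):
    """Euler-integrate df/dt = p_imbalance * f0 / (2 H): grid frequency
    after t seconds of constant power imbalance (no damping, no controls)."""
    f, steps = f0, int(t / dt)
    for _ in range(steps):
        f += dt * p_imbalance_pu * f0 / (2 * inertia_h)
    return f

# Losing 10% of generation (p = -0.1 pu) for one second:
low_inertia = frequency_after(-0.1, inertia_h=2.0)   # -> 48.75 Hz
high_inertia = frequency_after(-0.1, inertia_h=6.0)  # -> ~49.58 Hz
print(round(low_inertia, 2), round(high_inertia, 2))
```

This is why a wind-hydro system with little synchronous inertia needs an explicit strategy for inertial energy, as the abstract notes.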


As defined in the ATM 2000+ Strategy (Eurocontrol 2001), the mission of the Air Traffic Management (ATM) System is: “For all the phases of a flight, the ATM system should facilitate a safe, efficient, and expeditious traffic flow, through the provision of adaptable ATM services that can be dimensioned in relation to the requirements of all the users and areas of the European air space. The ATM services should comply with the demand, be compatible, operate under uniform principles, respect the environment and satisfy the national security requirements.” The objective of this paper is to present a methodology designed to evaluate the status of the ATM system in terms of the relationship between offered capacity and traffic demand, identifying areas of weakness and proposing solutions. The first part of the methodology concerns the characterization and evaluation of the current system, while the second part proposes an approach for analyzing its possible development limit. As part of the work, general criteria are established to define the framework in which the analysis and diagnostic methodology is placed: the use of Air Traffic Control (ATC) sectors as the unit of analysis, the presence of network effects, the tactical focus, the relative character of the analysis, objectivity, and a high-level assessment that allows assumptions about the human and Communications, Navigation and Surveillance (CNS) elements, considered the typical high-density air traffic resources. The steps followed by the methodology start with the definition of indicators and metrics, such as the nominal criticality or the nominal efficiency of a sector; continue with scenario characterization, where the necessary data are collected; network-effects analysis, to study the relations among the constitutive elements of the ATC system; diagnosis by means of the “System Status Diagram”; and an analytical study of the ATC system's development limit; and conclude with the formulation of conclusions and proposals for improvement.
This methodology was employed by Aena (Spanish Airports Manager and Air Navigation Service Provider) and INECO (Spanish Transport Engineering Company) in the analysis of the Spanish ATM System in the frame of the Spanish airspace capacity sustainability program, although it could be applied elsewhere.
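The abstract does not define the nominal criticality or nominal efficiency indicators; as a purely illustrative stand-in, a sector-level status diagram can be driven by a simple demand-to-capacity ratio. The sector names and thresholds below are hypothetical.

```python
def sector_status(demand, capacity, warn=0.8):
    """Classify an ATC sector by its demand-to-capacity ratio -- an
    illustrative stand-in for the paper's criticality indicators."""
    load = demand / capacity
    if load >= 1.0:
        return "overloaded"
    return "critical" if load >= warn else "nominal"

# Hypothetical sectors: (peak demand, declared capacity) in flights/hour.
sectors = {"SECTOR-N": (42, 40), "SECTOR-E": (33, 40), "SECTOR-W": (20, 40)}
for name, (demand, capacity) in sectors.items():
    print(name, sector_status(demand, capacity))
```

A real diagnosis would add the network effects the paper describes, since an overloaded sector propagates delay to its neighbours.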


In general, insurance is a form of risk management used to hedge against a contingent loss. The conventional definition is the equitable transfer of a risk of loss from one entity to another in exchange for a premium, or a guaranteed and quantifiable small loss taken to prevent a large and possibly devastating one; agricultural insurance is a special line of property insurance. Agricultural insurance, as currently designed in the Spanish scenario, was established in 1978. At the scale of macroeconomic insurance studies, a basic element of the actuarial components must be known: the sum insured. When a new risk has to be assessed within the insurance framework, it is essential to determine the capital at risk across the whole of Spanish agriculture. In this study, three crop cases (cereal, citrus, and vineyard) are presented to determine the sum insured, as they are representative of the cases found in Spanish agriculture. A crop's sum insured is calculated as the product of crop surface, unit-surface production, and insured crop price. Cereal, counting both winter and spring sowings, represents the largest Spanish crop surface, at over 6 million hectares (ha). Meanwhile, the four citrus species (oranges, mandarins, lemons, and grapefruits) occupy just over 275,000 ha. Vineyards destined for wine production, in turn, cover almost one million ha in Spain.
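The sum-insured formula above is a simple product; the sketch below applies it with illustrative figures (the yield and insured price are assumptions, not official Spanish insurance values).

```python
def sum_insured(surface_ha, yield_t_per_ha, insured_price_per_t):
    """Crop sum insured = surface x unit-surface production x insured price."""
    return surface_ha * yield_t_per_ha * insured_price_per_t

# Illustrative cereal figures: 6 million ha, 3 t/ha, 180 EUR/t insured price.
print(sum_insured(6_000_000, 3.0, 180.0))  # -> 3240000000.0 (EUR)
```

The same calculation, run per crop and per region, is what aggregates into the national capital-at-risk figure the study is after.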


This thesis falls within the field of Multiband Orthogonal Frequency Division Multiplexing Ultra Wideband (MB-OFDM UWB), which has gained great importance in high-data-rate wireless communications over the last decade. UWB emerged to satisfy the growing demand for low-cost, high-speed wireless connections for indoor and home use. The large available bandwidth, the potential for high transmission rates, low complexity, and low power consumption, together with low implementation cost, represent a unique opportunity for UWB to become a widely adopted solution for Wireless Personal Area Network (WPAN) applications. UWB is defined as any transmission that occupies a bandwidth of more than 20% of its center frequency, or more than 500 MHz. In 2002, the Federal Communications Commission (FCC) ruled that UWB may legally transmit in the range from 3.1 to 10.6 GHz at a transmission power of -41.3 dBm/Hz. Under the FCC guidelines, UWB technology can provide enormous capacity in short-range communications. Considering Shannon's capacity equations, increasing channel capacity requires a linear increase in bandwidth, whereas a similar capacity increase would require an exponential increase in transmission power. In recent years, different UWB developments have been studied extensively in different areas, among which the MB-OFDM UWB wireless communication protocol is considered the leading choice and has been adopted as an ISO/IEC standard for WPANs. By combining OFDM modulation with data transmission using frequency-hopping techniques, the MB-OFDM UWB system can support data rates ranging from 55 to 480 Mbps over distances of up to 10 meters.
MB-OFDM technology is expected to consume very little power and occupy a very small silicon area, providing low-cost solutions that satisfy market demands. To meet these expectations, MB-OFDM UWB research and development must face several challenges, such as high-sensitivity synchronization, low-complexity constraints, strict power limitations, scalability, and flexibility. Such challenges require state-of-the-art digital signal processing capable of producing systems that can take full advantage of the UWB spectrum and support future indoor wireless applications. This thesis focuses on the full optimization of a digital MB-OFDM UWB baseband transceiver system, with the objective of researching and designing a wireless communication subsystem for Wireless Visual Sensor Network applications. The inherent complexity of the FFT/IFFT processors and the synchronization system, together with the high operating frequency of all processing elements, becomes the bottleneck for the design and implementation of a low-power MB-OFDM-based digital UWB baseband system. The goal of the proposed transceiver is to achieve low power and low complexity while maintaining high performance. Optimizations are carried out at both the algorithmic and architectural levels for all elements of the system. A power-efficient hardware architecture is first proposed for the modules corresponding to the computational cores.
For Fast Fourier Transform (FFT/IFFT) processing, a mixed-radix algorithm based on a pipelined architecture is proposed, and a cost-speed balanced Viterbi Decoder (VD) module has been developed with the aim of reducing power consumption and increasing processing speed. A simple sign-bit correlator for symbol-timing synchronization has also been implemented; this correlator is used to detect and synchronize OFDM packets robustly and accurately. State-of-the-art technologies have been employed to develop the processing subsystems and to integrate the complete system. The device used for the proposed system is a Xilinx Virtex 5 XC5VLX110T FPGA, on which the proposed transceiver system has been implemented and validated. This work presents an algorithm, and an architecture, designed under a hardware/software co-design philosophy for the development of complex FPGA systems. The main objective of the proposed strategy is to find an efficient methodology for designing an optimized, configurable FPGA system with the least possible effort spent on the verification procedure, thus shortening the system development period. The presented co-design methodology has the advantage of being easy to use, covers all steps from algorithm proposal to hardware verification, and can be extended to almost all types of FPGA development. Since only the digital baseband transceiver system has been developed in this work, testing the transmitted signals over the wireless channel in real communication environments still requires RF components and an analog front-end.
Nevertheless, using the hardware/software co-simulation methodology mentioned above, it is possible to connect the digital transmitter and receiver through the channel models proposed by IEEE 802.15.3a, implemented in MATLAB. Therefore, by simply adjusting the characteristics of each channel model, for example an increase in delay or in center frequency, we can estimate the behavior of the proposed system in different scenarios and environments. The main contributions of this thesis are: • A new 128-point mixed-radix FFT algorithm using a multi-path pipelined architecture has been proposed. The complex multipliers for each processing stage are designed using a modified shift-add architecture. Data word lengths and twiddle-factor word lengths are compared and selected based on signal-to-quantization-noise ratio (SQNR) and power analysis. • The performance of the IFFT processor is analyzed under different block floating-point (BFP) arithmetic configurations for overflow control, in order to find the best IFFT architecture based on the proposed FFT processor. • For the MB-OFDM UWB receiver, a novel low-complexity timing synchronization and compensation scheme has been employed, consisting of Packet Detector (PD) and timing-offset estimation functions. By simplifying the cross-correlation and maximum-likelihood functions to sign-bit only, the computational complexity is significantly reduced. • A 64-state soft-decision Viterbi decoder using a radix-4 add-compare-select architecture has been proposed. The Two-pointer Even algorithm is also introduced in the traceback unit in order to achieve hardware efficiency.
• Several state-of-the-art technologies have been integrated into the complete baseband transceiver system, with the objective of implementing a highly optimized UWB communication system. • An improved design flow is proposed for complex system implementation, which can be used for general Field Programmable Gate Array (FPGA) designs. This flow not only dramatically reduces the time needed for functional verification, but also provides automatic analysis, such as of output delay errors, for the implemented hardware system. • A virtual communication environment is established for the validation of the proposed MB-OFDM transceiver system. This method makes it easy and convenient to analyze the digital baseband system, without an analog front-end, under different communication environments. This doctoral thesis is organized in six chapters. Chapter 1 gives a brief introduction to the UWB field, the related project, and the motivation for developing the MB-OFDM system. Chapter 2 presents general information on, and the requirements of, the MB-OFDM UWB wireless communication protocol. Chapter 3 deals with the architecture of the digital MB-OFDM baseband transceiver system; the proposed algorithm design and the architecture of each processing element are detailed in this chapter, along with the design challenges, which involve trade-offs between design complexity, power consumption, hardware cost, system performance, and other aspects. Chapter 4 describes the hardware/software co-design methodology, detailing each part of the design flow with examples carried out during system development.
Taking advantage of this design strategy, the virtual communication procedure is carried out to test and analyze the proposed transceiver architecture. The experimental results of the co-simulation and the synthesis report of the FPGA system implementation are given in Chapter 5. Finally, Chapter 6 includes the conclusions and future work, as well as the results derived from this PhD project. ABSTRACT In recent years, the Wireless Visual Sensor Network (WVSN) has drawn great interest in the wireless communication research area. WVSNs enable a wealth of new applications such as building security control, image sensing, and target localization. However, current wireless communication protocols (e.g. ZigBee, Wi-Fi, and Bluetooth) cannot fully satisfy the demands for high data rate, low power consumption, short range, and high robustness. A new communication protocol is highly desirable for this kind of application. The Ultra Wideband (UWB) wireless communication protocol, which has grown in importance in the high-data-rate wireless communication field, is emerging as an important topic for WVSN research. UWB has emerged as a technology that offers great promise to satisfy the growing demand for low-cost, high-speed digital wireless indoor and home networks. The large available bandwidth, the potential for high-data-rate transmission, and the potential for low complexity and low power consumption, along with low implementation cost, all present a unique opportunity for UWB to become a widely adopted radio solution for future Wireless Personal Area Network (WPAN) applications. UWB is defined as any transmission that occupies a bandwidth of more than 20% of its center frequency, or more than 500 MHz. In 2002, the Federal Communications Commission (FCC) mandated that UWB radio transmission can legally operate in the range from 3.1 to 10.6 GHz at a transmitter power density of -41.3 dBm/MHz.
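The two-part regulatory definition above (fractional bandwidth above 20%, or absolute bandwidth above 500 MHz) is simple enough to check numerically; a minimal sketch, with an illustrative function name and example band edges:

```python
def is_uwb(f_low_hz: float, f_high_hz: float) -> bool:
    """Regulatory UWB test: a signal is UWB if its fractional bandwidth
    (bandwidth over center frequency) exceeds 20%, or if its absolute
    bandwidth exceeds 500 MHz."""
    bandwidth = f_high_hz - f_low_hz
    f_center = (f_high_hz + f_low_hz) / 2
    return bandwidth / f_center > 0.20 or bandwidth > 500e6

# The full FCC UWB band, 3.1-10.6 GHz, easily qualifies:
print(is_uwb(3.1e9, 10.6e9))     # True
# A single 528 MHz MB-OFDM sub-band also clears the 500 MHz floor:
print(is_uwb(3.168e9, 3.696e9))  # True
```

By contrast, a conventional 80 MHz band around 2.44 GHz fails both criteria, which is why the narrowband protocols named above fall outside the UWB definition.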
Under the FCC guidelines, UWB technology can provide enormous capacity over short communication ranges. Considering Shannon's capacity equation, increasing the channel capacity requires only a linear increase in bandwidth, whereas a comparable capacity increase would require an exponential increase in transmission power. In recent years, several different UWB approaches have been widely studied, among which the MB-OFDM UWB wireless communication protocol is considered the leading choice and has recently been adopted in the ISO/IEC standard for WPANs. By combining OFDM modulation with data transmission using frequency-hopping techniques, the MB-OFDM UWB system is able to support various data rates, ranging from 55 to 480 Mbps, over distances up to 10 meters. The MB-OFDM technology is expected to consume very little power and silicon area, as well as provide low-cost solutions that can satisfy consumer-market demands. To fulfill these expectations, MB-OFDM UWB research and development have to cope with several challenges, which include high-sensitivity synchronization, low-complexity constraints, strict power limitations, scalability, and flexibility. Such challenges require state-of-the-art digital signal processing expertise to develop systems that can take full advantage of the UWB spectrum and support future indoor wireless applications. This thesis focuses on the full optimization of the MB-OFDM UWB digital baseband transceiver system, aiming at researching and designing a wireless communication subsystem for Wireless Visual Sensor Network (WVSN) applications. The inherent high complexity of the FFT/IFFT processor and the synchronization system, together with the high operating frequency of all processing elements, becomes the bottleneck for the hardware design and implementation of a low-power MB-OFDM-based UWB digital baseband system.
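The bandwidth-versus-power trade-off drawn from Shannon's formula, C = B·log2(1 + SNR), can be made concrete with a short numerical sketch (the bandwidth and SNR figures below are illustrative, not values from the thesis):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = B * log2(1 + SNR), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Baseline: 500 MHz of bandwidth at a linear SNR of 3 (about 4.8 dB).
c0 = shannon_capacity(500e6, 3.0)     # 1 Gbps
# Doubling capacity via bandwidth needs only 2x the bandwidth...
c_bw = shannon_capacity(1e9, 3.0)     # 2 Gbps
# ...but doubling it via power needs SNR = (1+3)^2 - 1 = 15, i.e. 5x the power.
c_pw = shannon_capacity(500e6, 15.0)  # 2 Gbps
```

This is the sense in which capacity grows linearly with bandwidth but only logarithmically with power, and it is why the wide UWB spectrum allocation is attractive for power-constrained short-range links.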
The proposed transceiver system targets low power and low complexity under the premise of high performance. Optimizations are made at both the algorithm and the architecture level for each element of the transceiver system. Low-power, hardware-efficient structures are first proposed for the core computation modules: a mixed-radix-algorithm-based pipelined architecture is proposed for the Fast Fourier Transform (FFT/IFFT) processor, and a cost-speed-balanced Viterbi Decoder (VD) module is developed, with the aim of lowering power consumption and increasing processing speed. In addition, a low-complexity sign-bit-correlation-based symbol timing synchronization scheme is presented so as to detect and synchronize the OFDM packets robustly and accurately. Moreover, several state-of-the-art technologies are used to develop the other processing subsystems, and an entire MB-OFDM digital baseband transceiver system is integrated. The target device for the proposed transceiver system is a Xilinx Virtex-5 XC5VLX110T FPGA board. In order to validate the proposed transceiver system on the FPGA board, a unified algorithm-architecture-circuit hardware/software co-design environment for complex FPGA system development is presented in this work. The main objective of the proposed strategy is to find an efficient methodology for designing a configurable, optimized FPGA system with as little effort as possible spent on the system verification procedure, so as to shorten the system development period. The presented co-design methodology has the advantages of being easy to use, covering all steps from algorithm proposal to hardware verification, and being applicable to almost all kinds of FPGA development. Because only the digital baseband transceiver system is developed in this thesis, validating the transmission of signals through a wireless channel in real communication environments still requires the analog front-end and RF components.
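The sign-bit correlation idea mentioned above replaces the full-precision multiplications of a sliding cross-correlation with sign comparisons, so each lag costs only sign flips and additions. A minimal behavioral sketch (the preamble values, packet offset, and function names are illustrative, not the thesis design):

```python
def sign(x: float) -> int:
    """Sign bit of a sample: +1 for non-negative, -1 for negative."""
    return 1 if x >= 0 else -1

def sign_bit_correlation(rx: list[float], preamble: list[float]) -> list[int]:
    """Sliding correlation using only the sign bits of both sequences.
    Each lag costs L sign comparisons instead of L real multiplications."""
    l = len(preamble)
    p_signs = [sign(p) for p in preamble]
    out = []
    for k in range(len(rx) - l + 1):
        # Agreement count: +1 when signs match, -1 otherwise.
        out.append(sum(sign(rx[k + i]) * p_signs[i] for i in range(l)))
    return out

# Illustrative preamble and a noisy received stream containing it at offset 4.
preamble = [1.0, -1.0, -1.0, 1.0, 1.0, -1.0, 1.0, 1.0]
rx = [0.2, -0.1, 0.3, -0.2] + [0.9 * p + 0.05 for p in preamble] + [0.1, -0.3]
peaks = sign_bit_correlation(rx, preamble)
print(peaks.index(max(peaks)))  # 4: the estimated packet start
```

The correlation peak survives the quantization to sign bits because the preamble's sign pattern is what carries the timing information; this is the source of the complexity reduction claimed for the Packet Detector.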
However, by using the aforementioned hardware/software co-simulation methodology, the transmitter and receiver digital baseband systems can communicate with each other through the channel models proposed by the IEEE 802.15.3a research group and established in MATLAB. Thus, by simply adjusting the characteristics of each channel model, e.g. the mean excess delay and the center frequency, we can estimate the transmission performance of the proposed transceiver system in different communication situations. The main contributions of this thesis are:
• A novel mixed-radix 128-point FFT algorithm using a multipath pipelined architecture is proposed. The complex multipliers for each processing stage are designed using modified shift-add architectures. The system word length and twiddle-factor word length are compared and selected based on Signal-to-Quantization-Noise Ratio (SQNR) and power analysis.
• The IFFT processor performance is analyzed under different Block Floating Point (BFP) arithmetic configurations for overflow control, so as to find the optimal IFFT architecture based on the proposed FFT processor.
• An innovative low-complexity timing synchronization and compensation scheme, consisting of Packet Detector (PD) and Timing Offset Estimation (TOE) functions, is employed for the MB-OFDM UWB receiver system. By simplifying the cross-correlation and maximum-likelihood functions to sign-bit-only operations, the computational complexity is significantly reduced.
• A 64-state soft-decision Viterbi Decoder system using a high-speed radix-4 Add-Compare-Select architecture is proposed. The Two-pointer Even algorithm is also introduced into the Trace Back unit with the aim of hardware efficiency.
• Several state-of-the-art technologies are integrated into the complete baseband transceiver system, with the aim of implementing a highly optimized UWB communication system.
• An improved design flow is proposed for complex system implementation, which can be used for general Field-Programmable Gate Array (FPGA) designs. The design method not only dramatically reduces the time needed for functional verification, but also provides automatic analysis of, for example, errors and output delays for the implemented hardware systems.
• A virtual communication environment is established for validating the proposed MB-OFDM transceiver system. This methodology proves easy to use and convenient for analyzing the digital baseband system, without the analog front-end, under different communication environments.
This PhD thesis is organized in six chapters. Chapter 1 gives a brief introduction to the UWB field and the related work, along with the motivation for the MB-OFDM system development. Chapter 2 presents the general information and the requirements of the MB-OFDM UWB wireless communication protocol. Chapter 3 presents the architecture of the MB-OFDM digital baseband transceiver system; the design of the proposed algorithm and architecture for each processing element is detailed in this chapter. The design challenges of such a system involve trade-off discussions among design complexity, power consumption, hardware cost, system performance, and other aspects; all these factors are analyzed and discussed. Chapter 4 proposes the hardware/software co-design methodology; each step of this design flow is detailed with examples encountered during system development. Then, taking advantage of this design strategy, the Virtual Communication procedure is carried out so as to test and analyze the proposed transceiver architecture. Experimental results from the co-simulation and the synthesis report of the implemented FPGA system are given in Chapter 5. Chapter 6 includes conclusions and future work, as well as the results derived from this PhD work.
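The mixed-radix 128-point decomposition named among the contributions above can be sketched behaviorally in a few lines of plain Python; this is a functional reference only (128 = 4·4·4·2, splitting by radix 4 and finishing with radix 2), not the thesis's pipelined hardware architecture, which additionally fixes word lengths and uses shift-add multipliers:

```python
import cmath

def mixed_radix_fft(x):
    """Recursive decimation-in-time FFT that splits by radix 4 while the
    length is divisible by 4, then falls back to radix 2 (128 = 4*4*4*2)."""
    n = len(x)
    if n == 1:
        return list(x)
    radix = 4 if n % 4 == 0 else 2
    # Split into `radix` interleaved subsequences and transform each.
    subs = [mixed_radix_fft(x[r::radix]) for r in range(radix)]
    m = n // radix
    out = [0j] * n
    for k in range(n):
        # Recombine the sub-transforms with twiddle factors W_n^(r*k).
        out[k] = sum(
            subs[r][k % m] * cmath.exp(-2j * cmath.pi * r * k / n)
            for r in range(radix)
        )
    return out
```

A behavioral model of this kind is exactly what the co-design flow described above compares against the hardware implementation during functional verification.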

Resumo:

This work introduces a web-based learning environment to facilitate learning in Project Management. The proposed web-based support system integrates methodological procedures and information systems, making it possible to promote learning among geographically dispersed students. Thus, students who are enrolled in different universities at different locations and attend their own project management courses share a virtual experience in executing and managing projects. Specific support systems were used or developed to automatically collect information about student activities, making it possible to monitor learning progress and assess learning performance against the defined rubric.