907 results for Sub-tropical Design
Abstract:
This research work presents the design and implementation of an FFT pruning block, an extension to an FFT core for OFDM demodulation that enables run-time pruning of the FFT algorithm without any restrictions on the distribution pattern of the active/inactive sub-carriers. The design and implementation of the FFT processor core itself is not part of this work. The whole design was prototyped on an ALTERA STRATIX V FPGA to evaluate the performance of the pruning engine. Synthesis and simulation results showed that the logic overhead introduced by the pruning block is limited to 10% of total resource utilization. Moreover, in the presence of medium-to-high scattering of the sub-carriers, power and energy consumption of the FFT core were reduced by 30%.
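The run-time pruning principle can be illustrated in software. The sketch below is not the hardware design from this work; it is a minimal recursive radix-2 FFT that skips every sub-transform value no active output bin depends on, for an arbitrary active-carrier pattern:

```python
import numpy as np

def pruned_fft(x, active):
    """Radix-2 DIT FFT computing only the output bins listed in `active`.

    Sub-transform values that no active bin depends on are never
    evaluated, mimicking butterfly pruning in hardware.
    """
    N = len(x)
    if N == 1:
        return {0: x[0]}
    half = N // 2
    need = sorted({k % half for k in active})   # bins needed from halves
    E = pruned_fft(x[0::2], need)               # even-indexed samples
    O = pruned_fft(x[1::2], need)               # odd-indexed samples
    w = np.exp(-2j * np.pi * np.arange(N) / N)  # twiddle factors
    return {k: E[k % half] + w[k] * O[k % half] for k in active}

rng = np.random.default_rng(7)
x = rng.standard_normal(16) + 1j * rng.standard_normal(16)
active = [0, 5, 11]                             # scattered active sub-carriers
out = pruned_fft(x, active)
ref = np.fft.fft(x)
assert all(abs(out[k] - ref[k]) < 1e-9 for k in active)
```

The sparser and more clustered the active set, the more recursion branches collapse, which is the software analogue of the power saving the pruning engine achieves in hardware.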
Abstract:
Despite the many issues faced in the past, the evolutionary trend of silicon has kept its constant pace. Today an ever-increasing number of cores is integrated onto the same die. Unfortunately, the extraordinary performance achievable by the many-core paradigm is limited by several factors. Memory bandwidth limitations, combined with inefficient synchronization mechanisms, can severely curtail the potential computation capabilities. Moreover, the huge HW/SW design space requires accurate and flexible tools to perform architectural explorations and validation of design choices. In this thesis we focus on the aforementioned aspects: a flexible and accurate Virtual Platform has been developed, targeting a reference many-core architecture. This tool has been used to perform architectural explorations, focusing on the instruction caching architecture and hybrid HW/SW synchronization mechanisms. Besides architectural implications, another issue of embedded systems is considered: energy efficiency. Near-Threshold Computing (NTC) is a key research area in the Ultra-Low-Power domain, as it promises a tenfold improvement in energy efficiency compared to super-threshold operation and mitigates thermal bottlenecks. The physical implications of modern deep sub-micron technology severely limit the performance and reliability of modern designs. Reliability becomes a major obstacle when operating in NTC; in particular, memory operation becomes unreliable and can compromise system correctness. In the present work a novel hybrid memory architecture is devised to overcome reliability issues and at the same time improve energy efficiency by means of aggressive voltage scaling when allowed by workload requirements. Variability is another great drawback of near-threshold operation. The greatly increased sensitivity to threshold voltage variations is today a major concern for electronic devices. We introduce a variation-tolerant extension of the baseline many-core architecture: by means of micro-architectural knobs and a lightweight runtime control unit, the baseline architecture becomes dynamically tolerant to variations.
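As a toy illustration of the knob-plus-runtime idea (the frequency model, parameter values and names below are all hypothetical, not taken from the thesis), a runtime controller can derate only the variation-affected cores instead of slowing the whole cluster to the worst case:

```python
import random

def safe_frequency_mhz(v_supply, dvth):
    """Hypothetical critical-path model: the frequency a core can sustain
    at supply voltage v_supply, given its threshold-voltage shift dvth."""
    return 1000.0 * (v_supply - (0.35 + dvth)) / v_supply

def tune(dvth_per_core, v_supply, f_target):
    """Per-core knob: run at f_target where timing closes, derate the rest."""
    return [min(f_target, safe_frequency_mhz(v_supply, d))
            for d in dvth_per_core]

random.seed(1)
cores = [random.gauss(0.0, 0.02) for _ in range(8)]  # Vth variation per core
freqs = tune(cores, v_supply=0.60, f_target=350.0)
worst = min(safe_frequency_mhz(0.60, d) for d in cores)
# A variation-oblivious cluster would clock every core at `worst`; the
# per-core knobs never deliver less aggregate throughput than that.
assert sum(freqs) >= len(cores) * min(worst, 350.0)
```

The sketch captures only the control policy; in the actual architecture the knobs act at the micro-architectural level and the controller is a lightweight hardware unit.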
Abstract:
Every year, thousands of surgical treatments are performed to repair or, where possible, completely replace organs or tissues affected by degenerative diseases. Patients with these kinds of illness wait long periods for a donor who could replace the damaged organ or tissue in a short time. The lack of biological alternatives to conventional surgical treatments such as autografts, allografts, and xenografts led researchers from different areas to collaborate in the search for innovative solutions. This research gave rise to a new discipline that merges molecular biology, biomaterials science, engineering, biomechanics and, more recently, design and architecture knowledge. This discipline is named Tissue Engineering (TE), and it represents a step forward towards substitutive or regenerative medicine. One of the major challenges of TE is to design and develop, using a biomimetic approach, an artificial 3D anatomical scaffold suitable for the adhesion of cells that can proliferate and differentiate in response to the biological and biophysical stimuli of the specific tissue to be replaced. Nowadays, powerful instruments allow increasingly accurate and well-defined analyses of patients who need more precise diagnoses and treatments. Starting from patient-specific information provided by CT (Computed Tomography), micro-CT and MRI (Magnetic Resonance Imaging), an image-based approach can be used to reconstruct the site to be replaced. With the aid of recent Additive Manufacturing techniques, which allow three-dimensional objects to be printed with sub-millimetric precision, it is now possible to exercise almost complete control over the parametric characteristics of the scaffold: this is the way to achieve correct cellular regeneration. In this work, we focus our attention on a branch of TE known as Bone TE, in which bone is the main subject.
Bone TE combines osteoconductive and morphological aspects of the scaffold, whose main properties are pore diameter, structure porosity and interconnectivity. Realizing the ideal values of these parameters is the main goal of this work: here we create a simple and interactive biomimetic design process, based on 3D CAD modeling and generative algorithms, that provides a way to control the main properties and to create a structure morphologically similar to cancellous bone. Two different typologies of scaffold will be compared: the first is based on Triply Periodic Minimal Surfaces (T.P.M.S.), whose basic crystalline geometries are nowadays used for Bone TE scaffolding; the second is based on Voronoi diagrams, which are more often used in the design of decorations and jewellery for their capacity to decompose and tessellate a volumetric space with a heterogeneous spatial distribution (frequently found in nature). In this work, we will show how to manipulate the main properties (pore diameter, structure porosity and interconnectivity) of TE-oriented scaffold design through the implementation of generative algorithms: "bringing nature back to nature".
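As an illustration of how an implicit-surface generative approach exposes porosity as a directly controllable parameter, here is a minimal sketch using the gyroid, a classic T.P.M.S. (the level-set threshold `t` plays the role of wall thickness; this is only an illustration, not the CAD workflow of this work):

```python
import numpy as np

def gyroid_porosity(t, n=64):
    """Porosity of a sheet-gyroid T.P.M.S. scaffold over one unit cell.

    Material occupies |sin x cos y + sin y cos z + sin z cos x| < t;
    porosity is the void volume fraction, estimated on an n^3 grid.
    """
    u = np.linspace(0, 2 * np.pi, n, endpoint=False)
    x, y, z = np.meshgrid(u, u, u, indexing="ij")
    f = (np.sin(x) * np.cos(y)
         + np.sin(y) * np.cos(z)
         + np.sin(z) * np.cos(x))
    return 1.0 - np.mean(np.abs(f) < t)

# Thickening the gyroid walls (larger t) lowers the porosity:
assert gyroid_porosity(0.2) > gyroid_porosity(0.8)
```

Sweeping `t` (and the unit-cell period) is the implicit analogue of the parameter control discussed above: one scalar tunes porosity, while the periodic surface guarantees full pore interconnectivity.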
Abstract:
Objective To assess the outcome of patients who experienced treatment failure with antiretrovirals in sub-Saharan Africa. Methods Analysis of 11 antiretroviral therapy (ART) programmes in sub-Saharan Africa. World Health Organization (WHO) criteria were used to define treatment failure. All ART-naive patients aged ≥16 years who started with a non-nucleoside reverse transcriptase inhibitor (NNRTI)-based regimen and had at least 6 months of follow-up were eligible. For each patient who switched to a second-line regimen, 10 matched patients who remained on a non-failing first-line regimen were selected. Time was measured from the time of switching, from the corresponding time in matched patients, or from the time of treatment failure in patients who remained on a failing regimen. Mortality was analysed using Kaplan–Meier curves and random-effects Cox models. Results Of 16 591 adult patients starting ART, 382 (2.3%) switched to a second-line regimen. Another 323 patients (1.9%) did not switch despite developing immunological or virological failure. Cumulative mortality at 1 year was 4.2% (95% CI 2.2–7.8%) in patients who switched to a second-line regimen and 11.7% (7.3–18.5%) in patients who remained on a failing first-line regimen, compared with 2.2% (1.6–3.0%) in patients on a non-failing first-line regimen (P < 0.0001). Differences in mortality were not explained by nadir CD4 cell count, age or differential loss to follow-up. Conclusions Many patients who meet criteria for treatment failure do not switch to a second-line regimen and die. There is an urgent need to clarify why so many patients in sub-Saharan Africa remain on failing first-line ART.
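The cumulative mortality figures above come from Kaplan–Meier estimation; for readers unfamiliar with it, a from-scratch sketch of the estimator (the study itself would have used standard statistical software):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function S(t).

    times  : follow-up time for each patient
    events : 1 if the patient died at that time, 0 if censored
    Returns (time, S(t)) pairs at each distinct death time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, s, curve, i = len(times), 1.0, [], 0
    while i < len(order):
        t = times[order[i]]
        deaths = removed = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk   # survive this death time
            curve.append((t, s))
        at_risk -= removed                # deaths and censorings both leave
    return curve

# 5 patients: deaths at t=2, 4, 6; censored at t=3 and t=5.
curve = kaplan_meier([2, 3, 4, 5, 6], [1, 0, 1, 0, 1])
assert abs(curve[0][1] - 0.8) < 1e-12      # 4/5 survive past t=2
assert abs(curve[1][1] - 8 / 15) < 1e-12   # then 2/3 of those past t=4
```

Censored patients contribute person-time while under observation but are not counted as deaths, which is exactly why cumulative mortality can be compared across groups with differential follow-up.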
Abstract:
Objectives To assess the proportion of patients lost to programme (died, lost to follow-up, transferred out) between HIV diagnosis and start of antiretroviral therapy (ART) in sub-Saharan Africa, and determine factors associated with loss to programme. Methods Systematic review and meta-analysis. We searched PubMed and EMBASE databases for studies in adults. Outcomes were the percentage of patients dying before starting ART, the percentage lost to follow-up, the percentage with a CD4 cell count, the distribution of first CD4 counts and the percentage of eligible patients starting ART. Data were combined using random-effects meta-analysis. Results Twenty-nine studies from sub-Saharan Africa including 148 912 patients were analysed. Six studies covered the whole period from HIV diagnosis to ART start. Meta-analysis of these studies showed that of the 100 patients with a positive HIV test, 72 (95% CI 60-84) had a CD4 cell count measured, 40 (95% CI 26-55) were eligible for ART and 25 (95% CI 13-37) started ART. There was substantial heterogeneity between studies (P < 0.0001). Median CD4 cell count at presentation ranged from 154 to 274 cells/μl. Patients eligible for ART were less likely to become lost to programme (25% vs. 54%, P < 0.0001), but eligible patients were more likely to die (11% vs. 5%, P < 0.0001) than ineligible patients. Loss to programme was higher in men, in patients with low CD4 cell counts and low socio-economic status and in recent time periods. Conclusions Monitoring and care in the pre-ART time period need improvement, with greater emphasis on patients not yet eligible for ART.
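The random-effects pooling used here can be sketched with the classic DerSimonian–Laird estimator (a minimal illustration of the method; the actual analysis would rely on full statistical packages):

```python
def dersimonian_laird(effects, variances):
    """Pool per-study effects with a DerSimonian-Laird random-effects model.

    effects   : per-study estimates (e.g. log risk ratios or proportions)
    variances : their within-study variances
    Returns (pooled effect, tau^2 between-study variance).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2
```

When the studies agree, tau^2 collapses to zero and the model reduces to a fixed-effect analysis; the substantial heterogeneity reported above (P < 0.0001) corresponds to a positive tau^2 that widens the pooled confidence interval.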
Abstract:
Currently, observations of space debris are primarily performed with ground-based sensors. These sensors have a detection limit of a few centimetres diameter for objects in Low Earth Orbit (LEO) and of about two decimetres diameter for objects in Geostationary Orbit (GEO). The few space-based debris observations stem mainly from in-situ measurements and from the analysis of returned spacecraft surfaces. Both provide information about mostly sub-millimetre-sized debris particles. As a consequence, the population of centimetre- and millimetre-sized debris objects remains poorly understood. The development, validation and improvement of debris reference models drive the need for measurements covering the whole diameter range. In 2003 the European Space Agency (ESA) initiated a study entitled “Space-Based Optical Observation of Space Debris”. The first tasks of the study were to define user requirements and to develop an observation strategy for a space-based instrument capable of observing uncatalogued millimetre-sized debris objects. Only passive optical observations were considered, focussing on mission concepts for the LEO and GEO regions, respectively. Starting from the requirements and the observation strategy, an instrument system architecture and an associated operations concept have been elaborated. The instrument system architecture covers the telescope, camera and onboard processing electronics. The proposed telescope is a folded Schmidt design, characterised by a 20 cm aperture and a large field of view of 6°. The camera design is based on the use of either a frame-transfer charge-coupled device (CCD), or a cooled hybrid sensor with fast read-out. A four-megapixel sensor is foreseen. For the onboard processing, a scalable architecture has been selected. Performance simulations have been executed for the system as designed, focussing on the orbit determination of observed debris particles, and on the analysis of the object detection algorithms.
In this paper we present some of the main results of the study. A short overview of the user requirements and observation strategy is given. The architectural design of the instrument is discussed, and the main tradeoffs are outlined. An insight into the results of the performance simulations is provided.
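A quick back-of-the-envelope check of the optics helps put the design numbers in context (the square 2048 × 2048 sensor layout and the mid-visible wavelength below are assumptions for illustration, not study parameters):

```python
import math

APERTURE_M = 0.20       # 20 cm aperture (from the study)
FOV_DEG = 6.0           # field of view (from the study)
NPIX = 2048             # assumed square layout of the 4-megapixel sensor
WAVELENGTH_M = 550e-9   # assumed mid-visible observing wavelength

pixel_scale_arcsec = FOV_DEG * 3600.0 / NPIX
# Rayleigh diffraction limit of the aperture, in arcseconds:
diffraction_arcsec = math.degrees(1.22 * WAVELENGTH_M / APERTURE_M) * 3600.0

# Each pixel spans many diffraction widths: the instrument trades angular
# resolution for the sky coverage a debris survey needs.
assert pixel_scale_arcsec > 10 * diffraction_arcsec
```

Under these assumptions the pixel scale comes out near 10.5 arcsec against a diffraction limit below one arcsec, consistent with a detection-driven rather than resolution-driven survey design.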
Abstract:
Large parts of the world are subjected to one or more natural hazards, such as earthquakes, tsunamis, landslides, tropical storms (hurricanes, cyclones and typhoons), coastal inundation and flooding. Virtually the entire world is at risk of man-made hazards. In recent decades, rapid population growth and economic development in hazard-prone areas have greatly increased the potential of multiple hazards to cause damage and destruction of buildings, bridges, power plants, and other infrastructure, thus posing a grave danger to communities and disrupting economic and societal activities. Although an individual hazard is significant in many parts of the United States (U.S.), in certain areas more than one hazard may pose a threat to the constructed environment. In such areas, structural design and construction practices should address multiple hazards in an integrated manner to achieve structural performance that is consistent with owner expectations and general societal objectives. The growing interest in and importance of multiple-hazard engineering has been recognized recently. This has spurred the evolution of multiple-hazard risk-assessment frameworks and the development of design approaches which have paved the way for future research towards sustainable construction of new and improved structures and retrofitting of existing structures. This report provides a review of the literature and the current state of practice for assessment, design and mitigation of the impact of multiple hazards on structural infrastructure. It also presents an overview of future research needs related to multiple-hazard performance of constructed facilities.
Abstract:
For the past sixty years, waveguide slot radiator arrays have played a critical role in microwave radar and communication systems. They feature a well-characterized antenna element capable of direct integration into a low-loss feed structure with highly developed and inexpensive manufacturing processes. Waveguide slot radiators comprise some of the highest-performance antenna arrays ever constructed, in terms of side-lobe level, efficiency, and related metrics. A wealth of information is available in the open literature regarding design procedures for linearly polarized waveguide slots. By contrast, despite their presence in some of the earliest published reports, little has been presented to date on array designs for circularly polarized (CP) waveguide slots. Moreover, that which has been presented features a classic traveling-wave, efficiency-reducing beam tilt. This work proposes a unique CP waveguide slot architecture which mitigates these problems, together with a thorough design procedure employing widely available, modern computational tools. The proposed array topology features simultaneous dual-CP operation with grating-lobe-free, broadside radiation, high aperture efficiency, and good return loss. A traditional X-Slot CP element is employed with the inclusion of a slow-wave-structure passive phase shifter to ensure broadside radiation without the need for performance-limiting dielectric loading. It is anticipated this technology will be advantageous for upcoming polarimetric radar and Ka-band SatCom systems. The presented design methodology represents a philosophical shift away from traditional waveguide slot radiator design practices. Rather than providing design curves and/or analytical expressions for equivalent circuit models, simple first-order design rules, generated via parametric studies, are presented with the understanding that device optimization and design will be carried out computationally.
A unit-cell, S-parameter-based approach provides a sufficient reduction of complexity to permit efficient, accurate device design with attention to realistic, application-specific mechanical tolerances. A transparent, start-to-finish example of the design procedure for a linear sub-array at X-Band is presented. Both unit-cell and array performance are calculated via finite element method simulations. Results are confirmed via good agreement with finite-difference time-domain calculations. Array performance exhibiting grating-lobe-free, broadside-scanned, dual-CP radiation with better than 20 dB return loss and over 75% aperture efficiency is presented.
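The grating-lobe-free broadside requirement can be checked numerically with a generic uniform linear-array model (the element count and spacings below are illustrative, not the reported design values):

```python
import numpy as np

def array_factor_db(n, spacing_wl, theta_deg):
    """Normalized array factor (dB) of an n-element uniform-amplitude,
    equal-phase (broadside) linear array; spacing is in wavelengths."""
    theta = np.radians(np.asarray(theta_deg, dtype=float))
    psi = 2 * np.pi * spacing_wl * np.sin(theta)     # inter-element phase
    af = np.abs(np.exp(1j * np.outer(psi, np.arange(n))).sum(axis=1))
    return 20 * np.log10(af / n + 1e-12)

theta = np.linspace(-90, 90, 3601)
af_ok = array_factor_db(16, 0.6, theta)   # d = 0.6 wavelengths
af_gl = array_factor_db(16, 1.2, theta)   # d = 1.2 wavelengths (over-spaced)

# With sub-wavelength spacing all near-peak energy stays at broadside ...
assert np.all(np.abs(theta[af_ok > -3.0]) < 5.0)
# ... while the over-spaced array grows grating lobes far off broadside.
assert np.any(np.abs(theta[af_gl > -3.0]) > 50.0)
```

Keeping all elements co-phased, which is what the slow-wave phase shifter accomplishes in the proposed topology, is precisely the condition under which the first (sub-wavelength) case applies.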
Abstract:
One of the scarcest resources in wireless communication systems is the limited frequency spectrum. Many wireless communication systems are hindered by bandwidth limitations and cannot provide high-speed communication. Ultra-wideband (UWB) communication, however, promises high-speed communication because of its very wide bandwidth of 7.5 GHz (3.1–10.6 GHz). This unprecedented bandwidth promises many advantages for 21st-century wireless communication systems. However, UWB poses many hardware challenges, such as the very high sampling rate required for analog-to-digital conversion, channel estimation, and implementation challenges. In this thesis, a new method is proposed that uses compressed sensing (CS), a mathematical framework for sub-Nyquist-rate sampling, to reduce the hardware complexity of the system. The method takes advantage of the unique signal structure of the UWB symbol. A new digital implementation method for CS-based UWB is also proposed. Lastly, a comparative study of CS-UWB hardware implementation methods is presented. Simulation results show that applying compressed sensing with the proposed method significantly reduces the hardware complexity compared to the conventional compressed-sensing-based UWB receiver.
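The sub-Nyquist idea behind CS can be sketched with Orthogonal Matching Pursuit, a standard sparse-recovery algorithm (a generic illustration, not the receiver architecture proposed in the thesis): far fewer random measurements than signal samples still suffice when the signal is sparse.

```python
import numpy as np

def omp(phi, y, sparsity):
    """Orthogonal Matching Pursuit: greedily recover a sparse x from
    undersampled measurements y = phi @ x."""
    support, residual = [], y.copy()
    for _ in range(sparsity):
        # Column most correlated with what is still unexplained.
        support.append(int(np.argmax(np.abs(phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(phi[:, support], y, rcond=None)
        residual = y - phi[:, support] @ coef
    x_hat = np.zeros(phi.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 96, 64, 2                  # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = [3.0, -3.0]   # sparse "UWB-like" signal
phi = rng.standard_normal((m, n)) / np.sqrt(m)     # random sensing matrix
x_hat = omp(phi, phi @ x, k)
assert np.allclose(x, x_hat, atol=1e-8)
```

Here 64 measurements recover a 96-sample sparse signal exactly; in a CS-UWB receiver the analogous saving shows up as a reduced analog-to-digital sampling burden.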
Abstract:
OBJECTIVES: To describe temporal trends in baseline clinical characteristics, initial treatment regimens and monitoring of patients starting antiretroviral therapy (ART) in resource-limited settings. METHODS: We analysed data from 17 ART programmes in 12 countries in sub-Saharan Africa, South America and Asia. Patients aged 16 years or older with a documented date of starting highly active ART (HAART) were included. Data were analysed by calculating medians, interquartile ranges (IQR) and percentages by region and time period. Not all centres provided data for 2006, so 2005 and 2006 were combined. RESULTS: A total of 36,715 patients who started ART in 1996-2006 were included in the analysis. Patient numbers increased substantially in sub-Saharan Africa and Asia, and the number of initial regimens declined to four and five, respectively, in 2005-2006. In South America 20 regimens were used in 2005-2006. A combination of 3TC/D4T/NVP was used for 56% of African patients and 42% of Asian patients; AZT/3TC/EFV was used in 33% of patients in South America. The median baseline CD4 count increased in recent years, to 122 cells/microl (IQR 53-194) in 2005-2006 in Africa, 134 cells/microl (IQR 72-191) in Asia, and 197 cells/microl (IQR 61-277) in South America, but 77%, 78% and 51% of patients, respectively, started with <200 cells/microl in 2005-2006. In all regions baseline CD4 cell counts were higher in women than in men: the differences were 22 cells/microl in Africa, 65 cells/microl in Asia and 10 cells/microl in South America. In 2005-2006 a viral load at 6 months was available for 21% of patients in Africa, 8% of patients in Asia and 73% of patients in South America. The corresponding figures for 6-month CD4 cell counts were 74%, 77% and 81%. CONCLUSIONS: The public health approach to providing ART proposed by the World Health Organization has been implemented in sub-Saharan Africa and Asia.
Although CD4 cell counts at the start of ART have increased in recent years, most patients continue to start with counts well below the recommended threshold. Particular attention should be paid to more timely initiation of ART in HIV-infected men.
Abstract:
The dynamics of aseasonal lowland dipterocarp forest in Borneo is influenced by perturbation from droughts. These events might increase in frequency and intensity in the future. This paper describes drought-affected dynamics between 1986 and 2001 in Sabah, Malaysia, and considers how it is possible, reliably and accurately, to measure both coarse- and fine-scale responses of the forest. Some fundamental concerns about methodology and data analysis emerge. In two plots forming 8 ha, mortality, recruitment, and stem growth rates of trees ≥10 cm gbh (girth at breast height) were measured in a ‘pre-drought’ period (1986–1996), and in a period (1996–2001) including the 1997–1998 ENSO drought. For 2.56 ha of subplots, mortality and growth rates of small trees (10–<50 cm gbh) were also determined for two sub-periods (1996–1999, 1999–2001). A total of c. 19 000 trees were recorded. Mortality rate increased by 25% while both recruitment and relative growth rates increased by 12% for all trees at the coarse scale. For small trees, at the fine scale, mortality increased by 6% and 9% from pre-drought to drought and on to ‘post-drought’ sub-periods. Relative growth rates correspondingly decreased by 38% and then increased by 98%. Tree size and topography interacted in a complex manner with between-plot differences. The forest appears to have been sustained by off-setting elevated tree mortality with highly resilient stem growth. The latter is seen as the key integrating tree variable linking the external driver (drought causing water stress) and the population dynamics recorded as mortality and recruitment. Suitably sound measurements of stem girth, leading to valid growth rates, are needed to understand and model tree dynamic responses to perturbations. The proportion of sound data, however, is in part determined by the drought itself.
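The mortality and relative growth rates compared above follow standard forest-dynamics definitions; a minimal sketch of the two formulas (these are the common exponential-rate conventions, not reproduced from the paper itself):

```python
import math

def annual_mortality_pct(n0, survivors, years):
    """Exponential annualized mortality rate (% per year), from stems
    alive at the start of a census interval and still alive after `years`."""
    return 100.0 * math.log(n0 / survivors) / years

def relative_growth_rate(g0, g1, years):
    """Relative stem growth rate for a girth increment g0 -> g1 over
    `years`, in per mille per year (a common unit for gbh increments)."""
    return 1000.0 * (math.log(g1) - math.log(g0)) / years

# 1000 stems with 880 survivors after 10 years: about 1.3% per year.
mort = annual_mortality_pct(1000, 880, 10)
# A stem growing from 10 to 12 cm gbh over 10 years: about 18 per mille/yr.
rgr = relative_growth_rate(10.0, 12.0, 10)
```

Because both rates are logarithmic, they can be compared across census intervals of different lengths, which is what makes the pre-drought versus drought-period contrasts above meaningful.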
Abstract:
Drought perturbation driven by the El Niño Southern Oscillation (ENSO) is a principal stochastic variable determining the dynamics of lowland rain forest in S.E. Asia. Mortality, recruitment and stem growth rates at Danum in Sabah (Malaysian Borneo) were recorded in two 4-ha plots (trees ≥ 10 cm gbh) for two periods, 1986–1996 and 1996–2001. Mortality and growth were also recorded in a sample of subplots for small trees (10 to <50 cm gbh) in two sub-periods, 1996–1999 and 1999–2001. Dynamics variables were employed to build indices of drought response for each of the 34 most abundant plot-level species (22 at the subplot level), these being interval-weighted percentage changes between periods and sub-periods. A significant yet complex effect of the strong 1997/1998 drought at the forest community level was shown by randomization procedures followed by multiple hypothesis testing. Despite a general resistance of the forest to drought, large and significant differences in short-term responses were apparent for several species. Using a diagrammatic form of stability analysis, different species showed immediate or lagged effects, high or low degrees of resilience or even oscillatory dynamics. In the context of the local topographic gradient, species’ responses define the newly termed perturbation response niche. The largest responses, particularly for recruitment and growth, were among the small trees, many of which are members of understorey taxa. The results bring with them a novel approach to understanding community dynamics: the kaleidoscopic complexity of idiosyncratic responses to stochastic perturbations suggests that plurality, rather than neutrality, of responses may be essential to understanding these tropical forests. The basis of the various responses lies in the mechanisms of tree-soil water relations, which are physiologically predictable: the timing and intensity of the next drought, however, are not.
To date, environmental stochasticity has been insufficiently incorporated into models of tropical forest dynamics, a step that might considerably improve the reality of theories about these globally important ecosystems.
Abstract:
The destruction of tropical forests continues to accelerate at an alarming rate, contributing an important fraction of overall greenhouse gas emissions. In recent years, much hope has been vested in the emerging REDD+ framework under the UN Framework Convention on Climate Change (UNFCCC), which aims at creating an international incentive system to reduce emissions from deforestation and forest degradation. This paper argues that in the absence of an international consensus on the design of results-based payments, “bottom-up” initiatives should take the lead and explore new avenues. It suggests that a call for tender for REDD+ credits might both help leverage private investments and spend scarce public funds in a cost-efficient manner. The paper discusses the pros and cons of results-based approaches, provides an overview of the goals and principles that govern public procurement and discusses their relevance for the purchase of REDD+ credits, in particular within the ambit of the European Union.
Abstract:
OBJECTIVES Cotrimoxazole prophylactic treatment (CPT) prevents opportunistic infections in HIV-infected or HIV-exposed children, but estimates of the effectiveness in preventing malaria vary. We reviewed studies that examined the effect of CPT on incidence of malaria in children in sub-Saharan Africa. METHODS We searched PubMed and EMBASE for randomised controlled trials (RCTs) and cohort studies on the effect of CPT on incidence of malaria and mortality in children and extracted data on the prevalence of sulphadoxine-pyrimethamine resistance-conferring point mutations. Incidence rate ratios (IRR) from individual studies were combined using random effects meta-analysis; confounder-adjusted estimates were used for cohort studies. The importance of resistance was examined in meta-regression analyses. RESULTS Three RCTs and four cohort studies with 5039 children (1692 HIV-exposed; 2800 HIV-uninfected; 1486 HIV-infected) were included. Children on CPT were less likely to develop clinical malaria episodes than those without prophylaxis (combined IRR 0.37, 95% confidence interval: 0.21-0.66), but there was substantial between-study heterogeneity (I-squared = 94%, P < 0.001). The protective efficacy of CPT was highest in an RCT from Mali, where the prevalence of antifolate resistant plasmodia was low. In meta-regression analyses, there was some evidence that the efficacy of CPT declined with increasing levels of resistance. Mortality was reduced with CPT in an RCT from Zambia, but not in a cohort study from Côte d'Ivoire. CONCLUSIONS Cotrimoxazole prophylactic treatment reduces incidence of malaria and mortality in children in sub-Saharan Africa, but study designs, settings and results were heterogeneous. CPT appears to be beneficial for HIV-infected and HIV-exposed as well as HIV-uninfected children.
Abstract:
Simulating the spatio-temporal dynamics of inundation is key to understanding the role of wetlands under past and future climate change. Earlier modelling studies have mostly relied on fixed prescribed peatland maps and inundation time series of limited temporal coverage. Here, we describe and assess the Dynamical Peatland Model Based on TOPMODEL (DYPTOP), which predicts the extent of inundation based on a computationally efficient TOPMODEL implementation. This approach rests on an empirical, grid-cell-specific relationship between the mean soil water balance and the flooded area. DYPTOP combines the simulated inundation extent and its temporal persistency with criteria for the ecosystem water balance and the modelled peatland-specific soil carbon balance to predict the global distribution of peatlands. We apply DYPTOP in combination with the LPX-Bern DGVM and benchmark the global-scale distribution, extent, and seasonality of inundation against satellite data. DYPTOP successfully predicts the spatial distribution and extent of wetlands and major boreal and tropical peatland complexes and reveals the governing limitations to peatland occurrence across the globe. Peatlands covering large boreal lowlands are reproduced only when accounting for a positive feedback induced by the enhanced mean soil water holding capacity in peatland-dominated regions. DYPTOP is designed to minimize input data requirements, is optimized for computational efficiency, and allows for modular adoption in Earth system models.
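The TOPMODEL core of this approach can be sketched as a threshold rule on the topographic wetness index: a pixel floods when its index exceeds a cut-off set by the grid cell's mean water balance, so the flooded fraction is the tail of the index distribution above that cut-off (signs, parameter values and the synthetic index field below are illustrative, not the model's fitted, cell-specific relationships):

```python
import random

def flooded_fraction(indices, mean_index, water_table_m, m_param=8.0):
    """TOPMODEL-style flooded area fraction of a grid cell.

    A pixel floods when its topographic index chi exceeds the threshold
    chi* = mean_index - water_table_m / m_param: a higher (less negative)
    water table lowers the threshold, so more of the cell is inundated.
    """
    threshold = mean_index - water_table_m / m_param
    wet = sum(1 for chi in indices if chi >= threshold)
    return wet / len(indices)

random.seed(42)
chi = [random.gauss(7.0, 2.0) for _ in range(10_000)]  # synthetic index field
mean_chi = sum(chi) / len(chi)
dry = flooded_fraction(chi, mean_chi, water_table_m=-2.0)  # table 2 m down
wet = flooded_fraction(chi, mean_chi, water_table_m=0.0)   # table at surface
assert dry < wet    # a rising water table expands the inundated area
```

Evaluating this relationship once per grid cell and month, rather than resolving sub-grid hydrology explicitly, is what keeps the scheme cheap enough for global, transient Earth-system simulations.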