892 results for Understanding by Design
Abstract:
The flow rates of drying and nebulizing gas, the heat block and desolvation line temperatures, and the interface voltage are potential electrospray ionization parameters, as they may enhance the sensitivity of the mass spectrometer. The conditions giving the highest sensitivity for 13 pharmaceuticals were explored. First, a Plackett-Burman design was implemented to screen for significant factors; it was concluded that interface voltage and nebulizing gas flow were the only factors influencing the signal intensity for all pharmaceuticals. This fractional factorial design was then projected onto a full 2^2 factorial design with center points. The lack-of-fit test proved significant, so a central composite face-centered design was conducted. Finally, stepwise multiple linear regression and a subsequent optimization were carried out. Two main drug clusters emerged from the signal intensities across all runs of the augmented factorial design. p-Aminophenol, salicylic acid, and nimesulide constitute one cluster, showing much higher sensitivity than the remaining drugs. The other cluster is more homogeneous, with sub-clusters comprising a pharmaceutical and its respective metabolite. The instrumental signal increased when both significant factors increased, with the maximum signal occurring when both coded factors were set at level +1. For most of the pharmaceuticals, interface voltage influences the signal intensity more than the nebulizing gas flow rate; the exceptions are nimesulide, where the relative importance of the factors is reversed, and salicylic acid, where both factors influence the signal equally.
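As a minimal illustration of the design sequence described above (a 2^2 factorial with center points augmented to a face-centered composite, then regression), the following sketch builds a CCF design matrix in coded units and fits a full quadratic response surface. The response values are synthetic and purely illustrative; they are not the study's data, only a stand-in that mimics the reported pattern (signal maximal at +1/+1, voltage weighted more than gas flow).

```python
# Hedged sketch: face-centered central composite design (CCF) for two coded
# factors (interface voltage x1, nebulizing gas flow x2) and an OLS fit of a
# full quadratic response surface. Response values are synthetic.
import numpy as np

# CCF for k = 2: 4 factorial points, 4 face-centered axial points, 1 center
factorial = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
axial = [(-1, 0), (1, 0), (0, -1), (0, 1)]
center = [(0, 0)]
design = np.array(factorial + axial + center, dtype=float)

def model_matrix(X):
    """Columns: intercept, x1, x2, x1*x2, x1^2, x2^2 (full quadratic)."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

# Synthetic signal: increases with both factors, maximal at (+1, +1),
# with voltage (x1) weighted more heavily than gas flow (x2).
y = 100 + 30 * design[:, 0] + 10 * design[:, 1] + 2 * design[:, 0] * design[:, 1]

beta, *_ = np.linalg.lstsq(model_matrix(design), y, rcond=None)
print(beta.round(3))  # recovers [100, 30, 10, 2, 0, 0] up to numerical noise
```

The 9-run CCF supports all six terms of the quadratic model, which is why the abstract's augmentation step follows a significant lack-of-fit test on the plain 2^2 design.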
Abstract:
Historically, the imaginary and hegemonic thinking of the Western global North have been marked by capitalist epistemologies and archetypes. Design presents itself as a practice and discipline shielded by a simplistic discourse of functional/communicative efficiency, wandering through multiple aestheticisms that appear neutral in relation to the symbolic; in fact they never are, because the aesthetic appearance of the generated forms will always be a revision of the ruling powers. We start from the understanding that the act of creating an aesthetic artifact is also a movement of inscription on a discursive platform (one that precedes it); it is in itself a narrative act and, as such, represents the taking of a position in relation to a certain symbolic reality. The reflection presented here sees design as a discipline and/or an instrument of action whose operational relevance tends to question and, simultaneously, to rehearse a response, one in which answering 'why' matters more than 'how'. Design appears to be a mediator of content, but it is also structure, body and idea. We think of design praxis as a discipline and as a tool for the inscription of critical thought and social transformation. To guide the research in this text, we propose the following question: can design claim for itself an engagement with the symbolic so as to take an active part in the production of critical thinking in the place to which it belongs? Methodologically, our argument is presented in two distinct moments: 1. a first, of an exploratory nature, in which we recover the issues of drawing within the practice of design; and 2. a second, of an analytical nature, concerning how (graphic and/or utilitarian) design objects incorporate the formal rites, political events and social practices of contemporary everyday life.
With this study we seek to contribute to the phenomenology of design by studying the configuration of artifacts, the messages they may convey, and the impact they may have on the social network.
Abstract:
This thesis, titled Governance and Community Capitals, explores the kinds of practical processes that have made governance work in three faith-based schools in the Western Highlands of Papua New Guinea (PNG). To date, the nation of PNG has been unable to meet its stated educational goals; however, some faith-based primary schools have overcome educational challenges by changing their local governance systems. What constitutes good governance in developing countries, and how it can be achieved in a PNG schooling context, has received very little scholarly attention. In this study, the subject of governance is approached at the nexus between the administrative sciences and asset-based community development. In this space, the researcher provides an understanding of the contribution that community capitals have made to understandings of local forms of governance in the development context. By and large, however, conceptions of governance have a history of being positioned within a Euro-centric frame, and very little, if anything, is known about the naming of capitals by indigenous peoples. In this thesis, six indigenous community capitals are made visible, expanding the repertoire of capitals published to date. The capitals identified and named in this thesis are: Story, Wisdom, Action, Blessing, Name and Unity. In-depth insights into these capitals are provided and, through the theoretical idea of performativity, the researcher advances an understanding of how the habitual enactment of the practical components of the capitals made governance work in this unique setting. The study draws on a grounded and appreciative methodology and is based on a case study design incorporating a three-stage cycle of investigation. The first stage tested the application of an assets-based method to documentary sources of data, including 'most significant change' stories, community mapping and visual diaries.
In the second stage, a group process method relevant to a PNG context was developed and employed. The third stage involved building theory from case study evidence using content analysis, language and metaphorical speech acts as guides for complex analysis. The thesis demonstrates the contribution that indigenous community capitals can make to understanding local forms of governance and how PNG faith-based schools meet their local governance challenges.
Abstract:
Tomato (Lycopersicon esculentum Mill.), apart from being a functional food rich in carotenoids, vitamins and minerals, is also an important source of phenolic compounds [1,2]. As antioxidants, these functional molecules play an important role in the prevention of human pathologies and have many applications in the nutraceutical, pharmaceutical and cosmeceutical industries. Therefore, the recovery of added-value phenolic compounds from natural sources, such as tomato surplus or industrial by-products, is highly desirable. Herein, the microwave-assisted extraction of the main phenolic acids and flavonoids from tomato was optimized. A 5-level full factorial Box-Behnken design was implemented and response surface methodology was used for the analysis. The extraction time (0-20 min), temperature (60-180 °C), ethanol percentage (0-100%), solid/liquid ratio (5-45 g/L) and microwave power (0-400 W) were studied as independent variables. The phenolic profile of the studied tomato variety was initially characterized by HPLC-DAD-ESI/MS [2]. Then, the effect of the different extraction conditions, as defined by the experimental design, on the target compounds was monitored by HPLC-DAD, using their UV spectra and retention times for identification and a series of calibrations based on external standards for quantification. The proposed model was successfully implemented and statistically validated. The microwave power had no effect on the extraction process. Compared with the optimal extraction conditions for flavonoids, which demanded a short processing time (2 min), a low temperature (60 °C), a low solid/liquid ratio (5 g/L), and pure ethanol, phenolic acids required a longer processing time (4.38 min), a higher temperature (145.6 °C), a higher solid/liquid ratio (45 g/L), and water as the extraction solvent. Additionally, the studied tomato variety was highlighted as a source of added-value phenolic acids and flavonoids.
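The structure of a Box-Behnken design like the one described can be sketched as follows. For brevity this hypothetical example uses only three of the five studied factors, with the ranges taken from the abstract; the construction (edge-midpoint runs plus center points) generalizes to all five.

```python
# Hedged sketch: constructing a Box-Behnken design for three of the five
# studied factors, then decoding coded levels (-1, 0, +1) to natural units.
# Ranges come from the abstract; restricting to three factors is illustrative.
from itertools import combinations
import numpy as np

ranges = {"time_min": (0, 20), "temp_C": (60, 180), "ethanol_pct": (0, 100)}
names = list(ranges)
k = len(names)

# Box-Behnken: for each pair of factors, a 2^2 factorial at +/-1 with the
# remaining factor(s) held at the mid-level, plus replicated center points.
rows = []
for i, j in combinations(range(k), 2):
    for a in (-1, 1):
        for b in (-1, 1):
            row = [0] * k
            row[i], row[j] = a, b
            rows.append(row)
rows += [[0] * k] * 3  # three center points
coded = np.array(rows, dtype=float)

def decode(coded_row):
    """Map each coded level in [-1, 1] to the factor's natural range."""
    out = {}
    for name, level in zip(names, coded_row):
        lo, hi = ranges[name]
        out[name] = (lo + hi) / 2 + level * (hi - lo) / 2
    return out

print(len(coded))        # 15 runs: 12 edge-midpoint runs + 3 centers
print(decode(coded[0]))  # first run in natural units
```

Unlike a central composite design, Box-Behnken never places all factors at their extremes simultaneously, which is convenient when corner conditions (e.g., maximum time at maximum temperature) risk degrading the analytes.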
Abstract:
The performance, energy efficiency and cost improvements due to traditional technology scaling have begun to slow down and present diminishing returns. Underlying reasons for this trend include fundamental physical limits of transistor scaling, the growing significance of quantum effects as transistors shrink, and a growing mismatch between transistors and interconnects regarding size, speed and power. Continued Moore's Law scaling will not come from technology scaling alone; it must also involve improvements to design tools and the development of new disruptive technologies such as 3D integration. 3D integration offers potential improvements to interconnect power and delay by translating the routing problem into a third dimension, and facilitates transistor density scaling independent of technology node. Furthermore, 3D IC technology opens up a new architectural design space of heterogeneously-integrated high-bandwidth CPUs. Vertical integration promises to provide the CPU architectures of the future by integrating high-performance processors with on-chip high-bandwidth memory systems and highly connected network-on-chip structures. Such techniques can overcome the well-known CPU performance bottlenecks referred to as the memory wall and the communication wall. However, the promising improvements to performance and energy efficiency offered by 3D CPUs do not come without cost, both in the financial investment needed to develop the technology and in the increased complexity of design. Two main limitations of 3D IC technology have been heat removal and TSV reliability. Transistor stacking increases power density, current density and thermal resistance in air-cooled packages. Furthermore, the technology introduces vertical through-silicon vias (TSVs) that create new points of failure in the chip and require the development of new BEOL technologies.
Although these issues can be controlled to some extent using thermal- and reliability-aware physical and architectural 3D design techniques, high-performance embedded cooling schemes, such as micro-fluidic (MF) cooling, are fundamentally necessary to unlock the true potential of 3D ICs. A new paradigm is being put forth which integrates the computational, electrical, physical, thermal and reliability views of a system. The unification of these diverse aspects of integrated circuits is called Co-Design. Independent design and optimization of each aspect leads to sub-optimal designs due to a lack of understanding of cross-domain interactions and their impact on the feasibility region of the architectural design space. Co-Design enables optimization across layers with a multi-domain view and thus unlocks new high-performance and energy-efficient configurations. Although the co-design paradigm is becoming increasingly necessary in all fields of IC design, it is even more critical in 3D ICs where, as we show, the inter-layer coupling and higher degree of connectivity between components exacerbate the interdependence between architectural parameters, physical design parameters and the multitude of metrics of interest to the designer (i.e., power, performance, temperature and reliability). In this dissertation we present a framework for multi-domain co-simulation and co-optimization of 3D CPU architectures with both air and MF cooling solutions. Finally, we propose an approach for design space exploration and modeling within the new Co-Design paradigm, and discuss possible avenues for future improvement of this work.
Abstract:
Biogeochemical-Argo is the extension of the Argo array of profiling floats to include floats that are equipped with biogeochemical sensors for pH, oxygen, nitrate, chlorophyll, suspended particles, and downwelling irradiance. Argo is a highly regarded, international program that measures the changing ocean temperature (heat content) and salinity with profiling floats distributed throughout the ocean. Newly developed sensors now allow profiling floats to also observe biogeochemical properties with sufficient accuracy for climate studies. This extension of Argo will enable an observing system that can determine the seasonal to decadal-scale variability in biological productivity, the supply of essential plant nutrients from deep-waters to the sunlit surface layer, ocean acidification, hypoxia, and ocean uptake of CO2. Biogeochemical-Argo will drive a transformative shift in our ability to observe and predict the effects of climate change on ocean metabolism, carbon uptake, and living marine resource management. Presently, vast areas of the open ocean are sampled only once per decade or less, with sampling occurring mainly in summer. Our ability to detect changes in biogeochemical processes that may occur due to the warming and acidification driven by increasing atmospheric CO2, as well as by natural climate variability, is greatly hindered by this undersampling. In close synergy with satellite systems (which are effective at detecting global patterns for a few biogeochemical parameters, but only very close to the sea surface and in the absence of clouds), a global array of biogeochemical sensors would revolutionize our understanding of ocean carbon uptake, productivity, and deoxygenation. The array would reveal the biological, chemical, and physical events that control these processes. 
Such a system would enable a new generation of global ocean prediction systems in support of studies of carbon cycling, acidification, hypoxia and harmful algal blooms, as well as the management of living marine resources. In order to prepare for a global Biogeochemical-Argo array, several prototype profiling float arrays have been developed at the regional scale by various countries and are now operating. Examples include regional arrays in the Southern Ocean (SOCCOM), the North Atlantic Subpolar Gyre (remOcean), the Mediterranean Sea (NAOS), the Kuroshio region of the North Pacific (INBOX), and the Indian Ocean (IOBioArgo). For example, the SOCCOM program is deploying 200 profiling floats with biogeochemical sensors throughout the Southern Ocean, including areas covered seasonally with ice. The resulting data, which are publicly available in real time, are being linked with computer models to better understand the role of the Southern Ocean in influencing CO2 uptake, biological productivity, and nutrient supply to distant regions of the world ocean. The success of these regional projects motivated a planning meeting to discuss the requirements for, and applications of, a global-scale Biogeochemical-Argo program. The meeting was held 11-13 January 2016 in Villefranche-sur-Mer, France, with attendees from the eight nations now deploying Argo floats with biogeochemical sensors. In preparation, computer simulations and a variety of analyses were conducted to assess the resources required for the transition to a global-scale array. Based on these analyses and simulations, it was concluded that an array of about 1000 biogeochemical profiling floats would provide the resolution needed to greatly improve our understanding of biogeochemical processes and to enable significant improvements in ecosystem models.
With an endurance of four years for a Biogeochemical-Argo float, this system would require the procurement and deployment of 250 new floats per year to maintain a 1000-float array. The lifetime cost of a Biogeochemical-Argo float, including capital expense, calibration, data management, and data transmission, is about $100,000. A global Biogeochemical-Argo system would thus cost about $25,000,000 annually. In the present Argo paradigm, the US provides half of the profiling floats in the array, while the EU, Austral/Asia, and Canada share most of the remaining half. If this approach is adopted, the US cost for the Biogeochemical-Argo system would be ~$12,500,000 annually, with ~$6,250,000 each for the EU and for Austral/Asia and Canada. This includes no direct costs for ship time and presumes that float deployments can be carried out from future research cruises of opportunity, including, for example, the international GO-SHIP program (http://www.go-ship.org). The full-scale implementation of a global Biogeochemical-Argo system with 1000 floats is feasible within a decade. The successful, ongoing pilot projects have provided the foundation and starting point for such a system.
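The sustainment arithmetic above follows directly from the figures stated in the text, as this small calculation shows (all numbers come from the abstract itself):

```python
# Hedged sketch: reproducing the Biogeochemical-Argo cost arithmetic.
# All figures are taken verbatim from the text; nothing here is new data.
ARRAY_SIZE = 1000          # target number of active floats
ENDURANCE_YEARS = 4        # stated float endurance
LIFETIME_COST = 100_000    # USD per float, incl. calibration, data handling

floats_per_year = ARRAY_SIZE // ENDURANCE_YEARS   # replacement rate
annual_cost = floats_per_year * LIFETIME_COST     # steady-state annual cost

us_share = annual_cost // 2     # US provides half, as in the core Argo array
other_share = annual_cost // 4  # EU; Austral/Asia and Canada

print(floats_per_year)          # 250
print(annual_cost)              # 25000000
print(us_share, other_share)    # 12500000 6250000
```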
Abstract:
Integrated circuit scaling has enabled a huge growth in processing capability, which necessitates a corresponding increase in inter-chip communication bandwidth. As bandwidth requirements for chip-to-chip interconnection scale, the deficiencies of electrical channels become more apparent. Optical links present a viable alternative due to their low frequency-dependent loss and higher bandwidth density in the form of wavelength division multiplexing. As integrated photonics and bonding technologies mature, commercialization of hybrid-integrated optical links is becoming a reality. Increasing silicon integration leads to better performance in optical links but necessitates a corresponding co-design strategy spanning both electronics and photonics. In this light, holistic design of high-speed optical links, with an in-depth understanding of photonics and state-of-the-art electronics, brings their performance to unprecedented levels. This thesis presents developments in high-speed optical links achieved by co-designing and co-integrating the primary elements of an optical link: receiver, transmitter, and clocking.
In the first part of this thesis, a 3D-integrated CMOS/silicon-photonic receiver is presented. The electronic chip features a novel design that employs a low-bandwidth TIA front-end with double-sampling and equalization through dynamic offset modulation. Measured results show -14.9 dBm sensitivity and an energy efficiency of 170 fJ/b at 25 Gb/s. The same receiver front-end is also used to implement a source-synchronous 4-channel WDM-based parallel optical receiver. Quadrature ILO-based clocking is employed for synchronization, along with a novel frequency-tracking method that exploits the dynamics of injection locking in a quadrature ring oscillator to increase the effective locking range. An adaptive body-biasing circuit is designed to keep the per-bit energy consumption constant across a wide range of data rates. The prototype measurements indicate a record-low power consumption of 153 fJ/b at 32 Gb/s. The receiver sensitivity is measured to be -8.8 dBm at 32 Gb/s.
Next, on the optical transmitter side, three new techniques are presented. The first is a differential ring modulator that breaks the optical bandwidth/quality-factor trade-off known to limit the speed of high-Q ring modulators. This structure maintains a constant energy in the ring to avoid pattern-dependent power droop. As a first proof of concept, a prototype has been fabricated and measured at up to 10 Gb/s. The second technique is thermal stabilization of micro-ring resonator modulators through direct measurement of temperature using a monolithic PTAT temperature sensor. The measured temperature is used in a feedback loop to adjust the thermal tuner of the ring. A prototype was fabricated, and the closed-loop feedback system is demonstrated to operate at 20 Gb/s in the presence of temperature fluctuations. The third technique is a switched-capacitor-based pre-emphasis scheme designed to extend the inherently low bandwidth of carrier-injection micro-ring modulators. A measured prototype of the optical transmitter achieves an energy efficiency of 342 fJ/bit at 10 Gb/s, and the wavelength stabilization circuit based on the monolithic PTAT sensor consumes 0.29 mW.
Lastly, a first-order frequency synthesizer suitable for high-speed on-chip clock generation is discussed. The proposed design features an architecture combining an LC quadrature VCO, two sample-and-holds, a phase interpolator (PI), digital coarse-tuning, and rotational frequency detection for fine-tuning. In addition to an electrical reference clock, the prototype chip is capable of receiving a low-jitter optical reference clock generated by a high-repetition-rate mode-locked laser. The output clock at 8 GHz has an integrated RMS jitter of 490 fs, a peak-to-peak periodic jitter of 2.06 ps, and a total RMS jitter of 680 fs. The reference spurs are measured to be -64.3 dB below the carrier. At 8 GHz, the system consumes 2.49 mW from a 1 V supply.
Abstract:
Free-draining bioretention systems commonly demonstrate poor nitrate removal. In this study, column tests verified the necessity of a permanently saturated zone to target nitrate removal via denitrification. Experiments determined a first-order denitrification rate constant of 0.0011 min^-1 specific to Willow Oak woodchip media. A 2.6-day retention time reduced 3.0 mg-N/L to below 0.05 mg-N/L. During simulated storm events, hydraulic retention time may be used as a predictive measure of nitrate fate and removal. A minimum 4.0-hour retention time was necessary for in-storm denitrification, defined as a minimum 20% nitrate removal. Additional environmental parameters, e.g., pH, temperature, oxidation-reduction potential, and dissolved oxygen, affect the denitrification rate and response, but macroscale measurements may not accurately depict conditions within the denitrifying biofilm. A simple model was developed to predict annual bioretention nitrate performance. Novel bioretention designs should incorporate bowl storage and large subsurface denitrifying zones to maximize treatment volume and contact time.
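The reported figures are mutually consistent under a standard first-order decay model, C(t) = C0·exp(-kt). A quick check, using only the numbers given in the abstract:

```python
# Hedged sketch: checking the first-order denitrification figures in the text.
# C(t) = C0 * exp(-k*t), with k = 0.0011 min^-1 (Willow Oak woodchip media).
import math

k = 0.0011          # min^-1, first-order denitrification rate constant
C0 = 3.0            # mg-N/L, influent nitrate concentration
t = 2.6 * 24 * 60   # 2.6-day retention time, converted to minutes

C = C0 * math.exp(-k * t)
print(round(C, 3))  # ~0.049 mg-N/L, i.e. just below the 0.05 mg-N/L target

# The 4.0-hour threshold for in-storm denitrification (>= 20% removal):
removal_4h = 1 - math.exp(-k * 4 * 60)
print(round(removal_4h, 2))  # ~0.23, consistent with the >= 20% criterion
```

Both stated design points (2.6 days for near-complete removal, 4.0 hours for 20% in-storm removal) fall out of the single rate constant, which supports the abstract's use of hydraulic retention time as the predictive variable.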
Abstract:
This paper discusses how to design a Radial Line Slot Antenna (RLSA) whose waveguide is filled with high-loss dielectric materials. We introduce a new design for the aperture slot coupling synthesis to restrain the dielectric losses and improve the antenna gain. Based on a newly defined slot coupling, a number of RLSAs with different sizes and loss factors are analyzed and their performances predicted. Theoretical calculations suggest that the gain is sensitive to the material losses in the radial lines. The gain enhancement from the new coupling formula is notable for larger antenna sizes and higher loss factors of the dielectric material. Three prototype RLSAs were designed and fabricated at 60 GHz following different slot coupling syntheses, and their measured performances corroborate our theory.
Abstract:
Doctoral thesis, Chemistry, specialization in Organic Chemistry, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2016
Abstract:
Dissertation submitted for the Master's degree in Communication Design at the Universidade de Lisboa - Faculdade de Arquitetura.
Abstract:
Adequacy of nutritional intake during the postoperative period, as measured by the change in weight-for-age z-scores from surgery to the time of discharge, was evaluated in infants (n = 58) diagnosed with a congenital heart defect and admitted for surgical intervention at Miami Children's Hospital, using a prospective observational study design. Parental consent was obtained for all infants who participated in the study. Forty patients had a weight available at hospital discharge. The mean preoperative weight-for-age z-score was -1.3 ±1.43 and the mean weight-for-age z-score at hospital discharge was -1.89 ±1.35, with a mean difference of 0.58 ±0.5 (P = 0.2). Nutritional intake during the postoperative period was inadequate, based on the decrease in weight-for-age z-scores from the time of surgery until discharge home. Our findings suggest that the limited fluid volume available for nutrition likely contributes to suboptimal nutritional delivery during the postoperative period; however, inadequate nutrition prescription may also be an important contributing factor. Development of a nutrition protocol for the initiation and advancement of nutrition support may reduce the delay in achieving patients' nutritional goals and may attenuate the observed decrease in z-scores during the postoperative period.
Abstract:
Legumes are bee-pollinated, but to differing extents. The importance of the plant–pollinator interplay (PPI) in flowering crops such as legumes lies in a combination of the importance of pollination as a production service, breeding strategies, and the increasing urgency of mitigating the decline of pollinators through the development and implementation of conservation measures. To realize the full potential of the PPI, a multidisciplinary approach is required. This article assembles an international team of genebank managers, geneticists, plant breeders, and experts on environmental governance and agro-ecology, and comprises several sections. The contributions in these sections outline both the state of the art of knowledge in the field and the novel aspects under development, and encompass a range of reviews, opinions and perspectives. The first three sections explore the role of PPI in legume breeding strategies. PPI-based approaches to crop improvement can make it possible to adapt and re-design breeding strategies to meet two goals: (1) optimal productivity, based on an efficient use of pollinators, and (2) biodiversity conservation. The next section deals with entomological aspects and focuses on the protection of the "pest control service" and pollinators in legume crops. The final section addresses general approaches to encourage synergy between food production and pollination services at the farmer field level. Two basic approaches are proposed: (a) Farming with Alternative Pollinators and (b) Crop Design System.
Abstract:
Thesis (Ph.D., Education) -- Queen's University, 2016-09-22
Abstract:
Since children already use and explore applications on smartphones, we take this as the starting point for design. Our monitoring and analysis framework, BaranC, enables us to discover and analyse which applications children use and precisely how they interact with them. The monitoring happens unobtrusively in the background, so children interact normally in their own natural environment without artificial constraints. Thus, we can discover to what extent a child of a particular age engages with, and how they physically interact with, existing applications. This information in turn provides the basis for the design of new child-centred applications, which can then be subjected to the same comprehensive child-use analysis using our framework. This work focuses on the first aspect, namely, the monitoring and analysis of current child use of smartphones. Experiments show the value of this approach, and interesting results have been obtained from this precise monitoring of child smartphone usage.