Abstract:
A model based on graph isomorphisms is used to formalize software evolution. Step by step, we narrow the search space through an informed selection of attributes based on the current state of the art in software engineering and generate a seed solution. We then traverse the resulting space using graph isomorphisms and other set operations over the vertex sets. The new solutions preserve the desired attributes. The goal of defining an isomorphism-based search mechanism is to construct predictors of evolution that can facilitate the automation of the 'software factory' paradigm. The model allows for automation via software tools implementing these concepts.
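As a hedged illustration of the kind of operation such a model relies on, a brute-force isomorphism test over small undirected graphs could be sketched as follows (the adjacency-map representation and all names here are our own; the abstract does not specify an implementation):

```python
from itertools import permutations

def edge_set(adj):
    # Undirected edges represented as frozensets of endpoints.
    return {frozenset((u, v)) for u, nbrs in adj.items() for v in nbrs}

def is_isomorphic(a, b):
    """Brute-force isomorphism check for small undirected graphs,
    given as {vertex: neighbour-list} adjacency maps."""
    va, vb = sorted(a), sorted(b)
    if len(va) != len(vb) or len(edge_set(a)) != len(edge_set(b)):
        return False
    eb = edge_set(b)
    for perm in permutations(vb):
        m = dict(zip(va, perm))  # candidate vertex relabelling
        if {frozenset(m[x] for x in e) for e in edge_set(a)} == eb:
            return True
    return False

# A relabelled triangle is isomorphic to a triangle; a 3-vertex path is not.
tri1 = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
tri2 = {'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b']}
path = {0: [1], 1: [0, 2], 2: [1]}
print(is_isomorphic(tri1, tri2))  # True
print(is_isomorphic(tri1, path))  # False
```

The factorial cost limits this to toy graphs; practical tooling would use refinement-based algorithms such as VF2.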
Abstract:
In this paper, we develop an energy-efficient resource-allocation scheme with proportional fairness for downlink multiuser orthogonal frequency-division multiplexing (OFDM) systems with distributed antennas. Our aim is to maximize energy efficiency (EE) under constraints on the overall transmit power of each remote access unit (RAU), proportional fairness of data rates, and bit error rates (BERs). Because of the nonconvex nature of the optimization problem, obtaining the optimal solution is extremely computationally complex. Therefore, we develop a low-complexity suboptimal algorithm that separates subcarrier allocation from power allocation. For the low-complexity algorithm, we first allocate subcarriers by assuming equal power distribution. Then, by exploiting the properties of fractional programming, we transform the nonconvex optimization problem in fractional form into an equivalent problem in subtractive form, which admits a tractable solution. Next, an optimal energy-efficient power-allocation algorithm is developed to maximize EE while maintaining proportional fairness. Through computer simulation, we demonstrate the effectiveness of the proposed low-complexity algorithm and illustrate the fundamental trade-off between energy-efficient and spectrally efficient transmission designs.
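The fractional-to-subtractive transformation mentioned above is the classic Dinkelbach idea: an efficiency ratio R/P is maximized by finding the q at which max R - q*P = 0. A minimal single-link sketch, with a hypothetical channel gain, circuit power and power cap rather than the paper's multiuser, multi-RAU model:

```python
import math

def dinkelbach_ee(g=10.0, p_c=0.5, p_max=2.0, iters=30):
    """Toy Dinkelbach iteration: maximize EE = log2(1 + g*p) / (p_c + p)
    over 0 <= p <= p_max by solving subtractive subproblems
    max_p log2(1 + g*p) - q*(p_c + p) and updating q."""
    q, p = 0.0, p_max
    for _ in range(iters):
        if q > 0:
            # First-order condition gives a water-filling-style closed form.
            p = min(p_max, max(0.0, 1.0 / (q * math.log(2)) - 1.0 / g))
        rate = math.log2(1.0 + g * p)
        q = rate / (p_c + p)  # current EE; converges to the optimum
    return q, p

q_opt, p_opt = dinkelbach_ee()
print(q_opt, p_opt)  # q converges to about 2.59, p to about 0.46
```

The real scheme layers subcarrier assignment and proportional-fairness constraints on top of this inner iteration.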
Abstract:
In cooperative communication networks, owing to the nodes' arbitrary geographical locations and individual oscillators, the system is fundamentally asynchronous. Such a timing mismatch may cause rank deficiency of conventional space-time codes and, thus, performance degradation. One efficient way to overcome this issue is to use delay-tolerant space-time codes (DT-STCs). Existing DT-STCs are designed assuming that the transmitter has no knowledge of the channels. In this paper, we show how the performance of DT-STCs can be improved by utilizing some feedback information. A general framework for designing DT-STCs with limited feedback is first proposed, allowing for flexible system parameters such as the number of transmit/receive antennas, the modulated symbols, and the length of the codewords. Then, a new design method is proposed that combines Lloyd's algorithm and the stochastic gradient-descent algorithm to obtain an optimal codebook of STCs, particularly for systems with a linear minimum-mean-square-error receiver. Finally, simulation results confirm the performance of the newly designed DT-STCs with limited feedback.
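As a hedged sketch of the codebook-training half of such a design, a scalar Lloyd quantizer alternates nearest-codeword assignment with centroid updates; the paper's actual method operates on space-time code matrices and couples this with stochastic gradient descent, which is omitted here:

```python
import random

def lloyd_codebook(samples, k=4, iters=50):
    """Scalar Lloyd's algorithm: assign each training sample to its
    nearest codeword, then move each codeword to its cell's centroid."""
    book = sorted(random.sample(samples, k))  # random initial codebook
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for s in samples:
            nearest = min(range(k), key=lambda i: (s - book[i]) ** 2)
            cells[nearest].append(s)
        # Empty cells keep their previous codeword.
        book = [sum(c) / len(c) if c else book[i] for i, c in enumerate(cells)]
    return sorted(book)

# Two well-separated clusters should yield one codeword near each centre.
random.seed(7)
train = [random.gauss(0, 0.2) for _ in range(40)] + \
        [random.gauss(10, 0.2) for _ in range(40)]
book = lloyd_codebook(train, k=2)
print(book)  # one codeword near 0, one near 10
```

In a limited-feedback system the receiver would report only the index of the chosen codeword, which is why a small, well-trained codebook matters.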
Abstract:
Cities globally are in the midst of taking action to reduce greenhouse gas (GHG) emissions. After the vital step of emissions quantification, strategies must be developed to detail how emissions reductions targets will be achieved. The Pathways to Urban Reductions in Greenhouse Gas Emissions (PURGE) model allows the estimation of emissions from four pertinent urban sectors: electricity generation, buildings, private transportation, and waste. Additionally, the carbon storage from urban and regional forests is modeled. An emissions scenario is examined for a case study of the greater Toronto, Ontario, Canada, area using data on current technology stocks and government projections for stock change. The scenario presented suggests that even with some aggressive targets for technological adoption (especially in the transportation sector), it will be difficult to achieve the less ambitious 2050 emissions reduction goals of the Intergovernmental Panel on Climate Change. This is largely attributable to the long life of the building stock and limitations of current retrofit practices. Additionally, demand reduction (through transportation mode shifting and building occupant behavior) will be an important component of future emissions cuts.
Abstract:
David Arnold, who retired this year as Professor of Asian and Global History at the University of Warwick, remains one of the most prolific historians of colonial medicine and modern South Asia. A founding member of the subaltern studies collective, he is widely considered a pioneer in the histories of colonial medicine, environment, penology, hunger and famines within South Asian studies and beyond. In this interview he recalls his formative inspirations and ideological motivations, and reflects critically on his earlier works, explaining various shifts as well as mapping the possible course of future work. He talks at length about his forthcoming works on everyday technology, food and monsoon Asia. Finally, he shares with us his desire to initiate work on an ambitious project about the twin themes of poison and poverty in South Asian history, beginning with the Bengal famine in the late eighteenth century and ending with the Bhopal gas tragedy of the early 1980s. This conversation provides insights into the ways in which the field of medical history in modern South Asia has been shaped over the past three decades through interactions with broader discussions on agency, resistance, power, everydayness, subaltern studies, and global and spatial histories. It hints further at the newer directions being opened up by such persisting intellectual entanglements.
Abstract:
The l1-norm sparsity constraint is a widely used technique for constructing sparse models. In this contribution, two zero-attracting recursive least squares algorithms, referred to as ZA-RLS-I and ZA-RLS-II, are derived by employing an l1-norm constraint on the parameter vector to promote model sparsity. In order to achieve a closed-form solution, the l1 norm of the parameter vector is approximated by an adaptively weighted l2 norm, in which the weighting factors are set as the inverse of the magnitudes of the associated parameter estimates, which are readily available in the adaptive learning environment. ZA-RLS-II is computationally more efficient than ZA-RLS-I, as it exploits known results from linear algebra as well as the sparsity of the system. The proposed algorithms are proven to converge, and adaptive sparse channel estimation is used to demonstrate the effectiveness of the proposed approach.
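The approximation at the heart of both algorithms can be illustrated directly: with weighting factors 1/(|w_i| + ε), an adaptively weighted l2 penalty reproduces the l1 norm (ε and the example vector below are our own choices, not the paper's):

```python
def weighted_l2_approx(w, eps=1e-8):
    """Reweighted-l2 surrogate of the l1 norm:
    sum_i w_i^2 / (|w_i| + eps) ~= sum_i |w_i|,
    which keeps the RLS update in closed form."""
    return sum(wi * wi / (abs(wi) + eps) for wi in w)

w = [0.5, -2.0, 0.0, 1.5]
print(weighted_l2_approx(w))     # ~= 4.0
print(sum(abs(wi) for wi in w))  # exactly 4.0
```

In the adaptive setting the weights are refreshed from the previous parameter estimates at each step, so the surrogate tracks the true l1 penalty as the estimates converge.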
Abstract:
Causing civilian casualties during military operations has become a highly politicised topic in international relations since the Second World War. Since the last decade of the 20th century, various scholars and political analysts have claimed that human life is valued more and more by the general international community. This argument has led many researchers to assume that democratic culture and traditions, together with modern ethical and moral concerns, have created a desire for a world without war or, at least, a demand that contemporary armed conflicts, if unavoidable, be far less lethal, forcing the military to seek new technologies that can minimise civilian casualties and collateral damage. Non-Lethal Weapons (NLW), weapons intended to minimise civilian casualties and collateral damage, are based on technology that, during the 1990s, was expected to revolutionise the conduct of warfare by making it significantly less deadly. The rapid rise of interest in NLW, ignited by the American military twenty-five years ago, sparked off an entirely new military, as well as academic, discourse concerning their potential contribution to military success on 21st-century battlefields. It seems, however, that beyond this debate, very little has been done within the military forces themselves. This research suggests that the roots of this situation lie much deeper than simple professional misconduct by the military establishment, or the poor political behaviour of the political leaders who sent them to fight. Following the story of NLW in the U.S., Russia and Israel, this research focuses on the political and cultural factors that were supposed to push the military organisations of these countries to adopt new technologies and operational and organisational concepts regarding NLW in an attempt to minimise enemy civilian casualties during their military operations.
This research finds that while the American, Russian and Israeli national characters are, undoubtedly, products of the unique historical experience of each of these nations, all three pay very little regard to foreigners' lives. Moreover, while it is generally argued that international political pressure is a crucial factor leading to a significant reduction in harmed civilians and destroyed civilian infrastructure, the findings of this research suggest that the American, Russian and Israeli governments are well prepared and politically equipped to fend off international criticism. As the analyses of the American, Russian and Israeli cases reveal, the political-military leaderships of these countries have few external or domestic reasons to minimise enemy civilian casualties through a fundamental, revolutionary change in their conduct of war. In other words, this research finds that the employment of NLW has failed because the political leadership asks the military to reduce enemy civilian casualties to a politically acceptable level, rather than to the technologically possible minimum; in the socio-cultural-political context of each country, support for the former appears to be significantly higher than for the latter.
Abstract:
The protective shielding design of a mammography facility requires knowledge of the radiation scattered by the patient and image-receptor components. The shape and intensity of secondary x-ray beams depend on the kVp applied to the x-ray tube, the target/filter combination, the primary x-ray field size, and the scattering angle. Currently, shielding calculations for mammography facilities are performed based on scatter fraction data for a Mo/Mo target/filter, even though modern mammography equipment is designed with different anode/filter combinations. In this work we present scatter fraction data evaluated from the x-ray spectra produced by Mo/Mo, Mo/Rh and W/Rh target/filter combinations, for 25, 30 and 35 kV tube voltages and scattering angles between 30 and 165 degrees. Three mammography phantoms were irradiated, and the scattered radiation was measured with a CdZnTe detector. The primary x-ray spectra were computed with a semiempirical model based on the air kerma and HVL measured with an ionization chamber. The results show that the scatter fraction values are higher for W/Rh than for Mo/Mo and Mo/Rh, although the primary and scattered air kerma are lower for W/Rh than for the Mo/Mo and Mo/Rh target/filter combinations. The scatter fractions computed in this work were applied in a shielding design calculation in order to evaluate the shielding requirements for each of these target/filter combinations. In addition, shielding requirements were evaluated by converting the scattered air kerma from mGy/week to mSv/week, initially adopting an air-kerma-to-effective-dose conversion coefficient of 1 Sv/Gy and then a mean conversion coefficient specific to the x-ray beam considered. Results show that the thickest barrier is required for the Mo/Mo target/filter combination. They also indicate that using a conversion coefficient from air kerma to effective dose of 1 Sv/Gy is conservatively high in the mammography energy range and overestimates the barrier thickness.
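A simplified sketch of the final conversion step only, ignoring field-size and occupancy/workload factors that a full secondary-barrier calculation would include, and using hypothetical numbers (not the paper's measured scatter fractions):

```python
def weekly_scatter_dose(k_primary_mGy, scatter_fraction, d_m, conv_Sv_per_Gy):
    """Weekly scattered air kerma at distance d (inverse-square falloff),
    converted to effective dose with an air-kerma-to-dose coefficient.
    All inputs below are illustrative placeholders."""
    k_scatter = k_primary_mGy * scatter_fraction / d_m ** 2  # mGy/week at d metres
    return k_scatter * conv_Sv_per_Gy  # numerically mSv/week

# Conservative 1 Sv/Gy vs. a lower, beam-specific coefficient:
print(weekly_scatter_dose(100.0, 0.01, 2.0, 1.0))  # 0.25 mSv/week
print(weekly_scatter_dose(100.0, 0.01, 2.0, 0.3))  # 0.075 mSv/week
```

The gap between the two outputs mirrors the paper's observation: taking 1 Sv/Gy at mammography energies inflates the dose estimate and hence the required barrier thickness.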
(c) 2008 American Association of Physicists in Medicine.
Abstract:
Mobile learning involves the use of mobile devices to participate in learning activities. Most e-learning activities are available to participants through learning systems such as learning content management systems (LCMS). Due to certain challenges, LCMS are not equally accessible on all mobile devices. This study investigates actual use, perceived usefulness and user experiences of LCMS use on mobile phones at Makerere University in Uganda. The study identifies challenges pertaining to use and discusses how to improve LCMS use on mobile phones. Such solutions are a cornerstone in enabling and improving mobile learning. Data was collected by means of focus group discussions, an online survey designed based on the Technology Acceptance Model (TAM), and LCMS log files of user activities, drawn from two courses where Moodle was used as the learning platform. The results indicate positive attitudes towards use of LCMS on phones, but also considerable challenges which are content-related and technical in nature.