Abstract:
Booth, Ken, Theory of World Security (Cambridge: Cambridge University Press, 2008), pp. xviii + 489.
Design and implementation of the embedded capacitance layers for decoupling of wireless sensor nodes
Abstract:
In this paper, an embedded capacitance material (ECM) is fabricated between the power and ground layers of wireless sensor nodes, forming an integrated capacitance that replaces the large number of decoupling capacitors on the board. The ECM, whose dielectric constant is 16, has the same footprint as the wireless sensor node (3 cm × 3 cm) and a thickness of only 14 μm. Although the capacitance of a single ECM layer is only around 8 nF, the ECM layers can still replace the high-frequency decoupling capacitors (100 nF in our case) on the board, for two reasons. First, the parasitic inductance of the ECM layer is much lower than that of surface-mount capacitors, so a smaller ECM capacitance can achieve the same resonant frequency as the surface-mount decoupling capacitors; simulation and measurement fit this assumption well. Second, more than one layer of ECM material is used at the design step, giving a parallel connection of several ECM capacitance layers and hence a larger capacitance and a smaller parasitic inductance. Characterization of the ECM is carried out with an LCR meter. To evaluate the behavior of the ECM layer, time- and frequency-domain measurements are performed on the power-bus decoupling of the wireless sensor nodes. Comparisons with measurements of a bare PCB and of the discrete decoupling-capacitor solution are provided to show the improvement brought by the ECM layer. Measurements show that the ECM layer not only saves the board space taken by surface-mount decoupling capacitors but also provides better power-bus decoupling to the nodes.
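The resonant-frequency argument above can be sketched numerically. A capacitor with series parasitic inductance L self-resonates at f = 1/(2π√(LC)); the inductance values below are illustrative assumptions, not measurements from the paper:

```python
import math

def resonant_freq_hz(c_farads, l_henries):
    """Series self-resonant frequency: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henries * c_farads))

# Assumed parasitics: an SMT capacitor carries nH-scale loop inductance,
# while a buried plane capacitance is far lower.
f_smt = resonant_freq_hz(100e-9, 1e-9)    # 100 nF SMT cap, ~1 nH   -> ~16 MHz
f_ecm = resonant_freq_hz(8e-9, 0.08e-9)   # 8 nF ECM layer, ~0.08 nH -> ~200 MHz

# Stacking n layers in parallel multiplies C by n and divides L by n, so the
# L*C product (hence the resonant frequency) is unchanged while the impedance
# at resonance drops.
n_layers = 2
f_stack = resonant_freq_hz(n_layers * 8e-9, 0.08e-9 / n_layers)
```

With a parasitic inductance more than 12× lower, the 8 nF layer resonates at or above the frequency of the 100 nF discrete part, illustrating the abstract's first argument; the stacking calculation illustrates the second.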
Abstract:
Apparent treatment-resistant hypertension (aTRH), defined as blood pressure above goal despite the use of 3 or more antihypertensive therapies of different classes at maximally tolerated doses (ideally including a diuretic), identifies patients at increased risk of cardiovascular (CV) outcomes. Recent epidemiologic studies in selected populations estimated the prevalence of aTRH at 10% to 15% among patients with hypertension and found that aTRH is associated with elevated risk of CV and renal outcomes. Additionally, aTRH and chronic kidney disease (CKD) are associated. Although the pathogenesis of aTRH is multifactorial, the kidney is believed to play a significant role. Increased volume expansion, aldosterone concentration, mineralocorticoid receptor activity, arterial stiffness, and sympathetic nervous system activity are central to the pathogenesis of aTRH and are targets of therapy. Although diuretics form the basis of therapy in aTRH, pathophysiologic and clinical data suggest an important role for aldosterone antagonism. Interventional techniques, such as renal denervation and carotid baroreceptor activation, modulate the sympathetic nervous system and are currently in phase III trials for the treatment of aTRH. These technologies are as yet unproven and have not been investigated in relation to CV outcomes or in patients with CKD.
Abstract:
A potential avenue for improving healthcare efficiency is to tailor individualized treatment strategies by incorporating patient-level predictor information such as environmental exposures and biological and genetic marker measurements. Many useful statistical methods for deriving individualized treatment rules (ITRs) have become available in recent years. Prior to adopting any ITR in clinical practice, it is crucial to evaluate its value in improving patient outcomes. Existing methods for quantifying such value mainly consider either a single marker or semi-parametric methods that are subject to bias under model misspecification. In this article, we consider a general setting with multiple markers and propose a two-step robust method to derive ITRs and evaluate their values. We also propose procedures for comparing different ITRs, which can be used to quantify the incremental value of new markers in improving treatment selection. While working models are used in step I to approximate optimal ITRs, we add a layer of calibration to guard against model misspecification and further assess the value of the ITR non-parametrically, which ensures the validity of the inference. To account for the sampling variability of the estimated rules and their corresponding values, we propose a resampling procedure to provide valid confidence intervals for the value functions as well as for the incremental value of new markers for treatment selection. Our proposals are examined through extensive simulation studies and illustrated with data from a clinical trial that studied the effects of two drug combinations on HIV-1-infected patients.
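As a toy illustration of valuing an ITR non-parametrically, here is a generic inverse-probability-weighted sketch on simulated randomized-trial data; the data-generating model, rules, and names are invented for the example and this is not the authors' calibrated two-step estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 1:1 randomized trial: X is a marker, A a 0/1 treatment, Y the
# outcome; treatment is beneficial only when the marker is positive.
n = 5000
x = rng.normal(size=n)
a = rng.integers(0, 2, size=n)                          # P(A=1 | X) = 0.5
y = 0.5 * x + a * x + rng.normal(scale=0.5, size=n)

def value_ipw(rule, x, a, y, p=0.5):
    """IPW estimate of E[Y] if every patient were treated per `rule`."""
    follows = rule(x) == a          # indicator that observed A matches the rule
    return np.mean(y * follows / p)

marker_rule = lambda x: (x > 0).astype(int)     # treat iff marker positive
always_treat = lambda x: np.ones_like(x, dtype=int)

v_rule = value_ipw(marker_rule, x, a, y)
v_all = value_ipw(always_treat, x, a, y)
incremental = v_rule - v_all    # gain from using the marker to select treatment
```

The comparison of value estimates across rules is what quantifies a marker's incremental value; the paper's resampling procedure would additionally attach a confidence interval to `incremental`.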
Abstract:
This qualitative, ongoing study is part of a master's thesis in Mexico supported by CONACYT. The aim is to identify the difficulties that students in the second cycle of primary school (8-9 years old) face when solving multiplicative problems according to the structure proposed by Vergnaud (1995) in the "Isomorphism of Measures". The theoretical framework is based on the "Local Theoretical Model" (Filloy, 1999). In the first of its two phases, we review the institutional curriculum (Secretaría de Educación Pública [SEP], 1993) and complementary literature on the teaching of multiplicative problems, and design diagnostic tests and exercises. As preliminary results, the children show deficient problem-solving strategies, because the official curriculum does not address problems related to the "Isomorphism of Measures". The children have difficulty solving everyday-life problems posed in the classroom.
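Vergnaud's "Isomorphism of Measures" covers simple-proportion problems relating two measure spaces, solvable by either a scalar or a functional operator. A minimal worked example (the quantities are invented for illustration):

```python
# Problem: 3 notebooks cost 12 pesos; what do 7 notebooks cost?
notebooks, pesos = 3, 12

# Scalar operator: stay within one measure space (3 notebooks -> 7 notebooks)
scalar_op = 7 / notebooks
cost_scalar = pesos * scalar_op         # 12 * (7/3) = 28

# Functional operator: map across measure spaces (pesos per notebook)
functional_op = pesos / notebooks       # 4 pesos each
cost_functional = 7 * functional_op     # 7 * 4 = 28
```

Both solution paths land on the same answer; recognizing that either operator is available is precisely the multiplicative structure the study probes.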
Abstract:
This paper describes the architecture of the knowledge based system (KBS) component of Smartfire, a fire field modelling tool for use by members of the fire safety engineering community who are not experts in modelling techniques. The KBS captures the qualitative reasoning of an experienced modeller in the assessment of room geometries, so as to set up the important initial parameters of the problem. Fire modelling expertise is an example of geometric and spatial reasoning, which raises representational problems. The approach taken in this project is a qualitative representation of geometric room information based on Forbus's concept of a metric diagram. This takes the form of a coarse grid, partitioning the domain in each of the three spatial dimensions. Inference over the representation is performed using a case-based reasoning (CBR) component. The CBR component stores example partitions with key set-up parameters; this paper concentrates on the key parameter of grid cell distribution.
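A minimal sketch of the idea, with invented case data and a deliberately simple distance measure (Smartfire's actual representation and similarity function are richer than this):

```python
from dataclasses import dataclass

# A room geometry is summarised by its coarse partition (cells per spatial
# axis), and the case base maps such partitions to a stored grid-cell
# distribution parameter. All names and values here are illustrative.
@dataclass
class Case:
    partition: tuple          # (nx, ny, nz) coarse cells per axis
    cell_distribution: str    # stored set-up parameter for this geometry

case_base = [
    Case((3, 3, 2), "uniform"),
    Case((6, 3, 2), "refined-near-doorway"),
    Case((3, 3, 4), "refined-near-ceiling"),
]

def retrieve(partition, cases):
    """Nearest-neighbour retrieval: pick the stored case whose coarse
    partition differs least from the query, summed over the three axes."""
    return min(cases, key=lambda c: sum(abs(p - q)
                                        for p, q in zip(c.partition, partition)))

best = retrieve((5, 3, 2), case_base)   # closest stored geometry wins
```

The retrieved case's `cell_distribution` would then seed the set-up parameters for the new room, which is the role the CBR component plays in the KBS.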
Abstract:
The PHYSICA software was developed to enable multiphysics modelling, allowing for interaction among Computational Fluid Dynamics (CFD), Computational Solid Mechanics (CSM), and Computational Aeroacoustics (CAA). PHYSICA uses the finite volume method with 3-D unstructured meshes to enable the modelling of complex geometries. Many engineering applications involve significant computational time, which needs to be reduced by means of a faster solution method or parallel and high-performance algorithms. It is well known that multigrid methods serve as a fast iterative scheme for linear and nonlinear diffusion problems. This paper attempts to address two major issues of this iterative solver: parallelisation of multigrid methods and their application to time-dependent multiscale problems.
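As a sketch of the kind of solver being discussed, here is a minimal geometric multigrid V-cycle for the 1-D model diffusion problem -u'' = f with zero Dirichlet boundaries (a textbook illustration, not PHYSICA code):

```python
import numpy as np

def weighted_jacobi(u, f, h, sweeps=3, w=2.0 / 3.0):
    """Smooth -u'' = f on a uniform 1-D grid (u = 0 at both boundaries)."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def v_cycle(u, f, h):
    """One V-cycle: smooth, restrict the residual, recurse, prolong, smooth."""
    u = weighted_jacobi(u, f, h)                         # pre-smooth
    if len(u) <= 3:
        return u
    r = np.zeros_like(u)                                 # residual r = f + u''
    r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
    rc = np.zeros((len(u) + 1) // 2)                     # restrict (full weighting)
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h)           # coarse-grid correction
    u += np.interp(np.arange(len(u)), np.arange(0, len(u), 2), ec)  # prolong
    return weighted_jacobi(u, f, h)                      # post-smooth

n = 129                                   # 2**7 + 1 points: grid coarsens evenly
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi ** 2 * np.sin(np.pi * x)        # exact solution: u(x) = sin(pi * x)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)
```

Each V-cycle cuts the error by a roughly grid-independent factor, which is why multigrid is attractive as a fast iterative scheme; parallelising the smoothing and transfer operators across mesh partitions is the issue the paper takes up.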
Abstract:
The North Sea is a dynamic large marine ecosystem which is bordered by a dense coastal population, contains a productive oil and gas province, has a dense shipping network and has one of the most productive fisheries in the world. An assessment of the state of health of the North Sea was initiated in 1987 as part of a developing series of international initiatives at Ministerial level to address concerns over the impact of these activities on the marine ecosystem. Four North Sea Ministerial Conferences (1984, 1987, 1990, 1995) and an Intermediate Ministerial Meeting (1993) have been held to date to develop a harmonized approach to the sustainable management of the North Sea. In 1988 at the request of Ministers a North Sea Task Force was established to co-ordinate work leading to the production of a Quality Status Report (QSR) on the North Sea in December 1993. In recognition of the large geographical and ecological diversity exhibited, a sub-regional approach was adopted and a total of 13 sub-regional assessment reports were produced to a common protocol. The Task Force established a five-year plan to co-ordinate research, monitoring and modelling and other special topics in the preparations for the QSR. As part of this exercise a ‘Monitoring Master Plan’ was drawn up to provide for the first time reliable spatial information on the distribution of chemical contaminants and biological effects throughout the North Sea. The Task Force was a unique structure in international collaboration with a fixed remit that ended in December 1993. It was successful in bringing together many diverse organisations with interests in the North Sea and co-ordinated to a tight timetable the production of the QSR. The experiences gained are now being applied to the whole north east Atlantic under a new OSPAR Convention and have wide application to other Large Marine Ecosystems.