989 results for Critical exponents
Abstract:
Glycogen storage disease type Ia (GSD-Ia) patients deficient in glucose-6-phosphatase-α (G6Pase-α or G6PC) manifest impaired glucose homeostasis characterized by fasting hypoglycemia, growth retardation, hepatomegaly, nephromegaly, hyperlipidemia, hyperuricemia, and lactic acidemia. Two efficacious recombinant adeno-associated virus pseudotype 2/8 (rAAV8) vectors expressing human G6Pase-α have been developed independently. One is a single-stranded vector containing a 2864-bp G6PC promoter/enhancer (rAAV8-GPE); the other is a double-stranded vector containing a shorter, 382-bp minimal G6PC promoter/enhancer (rAAV8-miGPE). To determine the best construct to take forward into clinical trials, a direct comparison of the rAAV8-GPE and rAAV8-miGPE vectors was initiated. We show that the rAAV8-GPE vector directed significantly higher levels of hepatic G6Pase-α expression, achieved greater reduction in hepatic glycogen accumulation, and led to better tolerance of fasting in GSD-Ia mice than the rAAV8-miGPE vector. Our results indicate that the additional control elements in the rAAV8-GPE vector outweigh the gains in transduction efficiency of the double-stranded rAAV8-miGPE vector, and that the rAAV8-GPE vector is the current choice for clinical translation in human GSD-Ia.
Abstract:
Software-based control of life-critical embedded systems has become increasingly complex and, to a large extent, has come to determine human safety. For example, implantable cardiac pacemakers contain over 80,000 lines of code responsible for keeping the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools that generate verified code from verified software models. To this effect, we have developed the UPP2SF model-translation tool, which automatically converts verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and then (c) automatically generated into modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code-generation and testing process for complex software-controlled embedded systems. © 2014 ACM.
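The pacing logic verified in such models can be illustrated with a minimal, hypothetical sketch in the style of a timed-automaton clock guard. The interval name and value below are illustrative assumptions, not taken from the paper's verified UPPAAL model:

```python
# Hypothetical sketch of a timed-automaton-style pacing guard.
# LRI_MS is an illustrative placeholder, not the paper's parameter.

LRI_MS = 1000  # lower rate interval: pace if no beat is sensed within it

def next_action(ms_since_last_beat, sensed):
    """Decide the pacing transition from the clock value and sense input."""
    if sensed:
        return "reset"  # intrinsic beat sensed: reset the interval clock
    if ms_since_last_beat >= LRI_MS:
        return "pace"   # clock guard (x >= LRI) fires the pacing transition
    return "wait"
```

A verifier such as UPPAAL would check properties over all timings of such transitions (e.g. that pacing always occurs before a safety bound), while the translated Stateflow chart exercises the same transitions in simulation.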
Abstract:
The second round of the community-wide initiative Critical Assessment of automated Structure Determination of Proteins by NMR (CASD-NMR-2013) comprised ten blind target datasets, consisting of unprocessed spectral data, assigned chemical shift lists and unassigned NOESY peak and RDC lists, made available in either curated (i.e. manually refined) or un-curated (i.e. automatically generated) form. Ten structure calculation programs, using fully automated protocols only, generated a total of 164 three-dimensional structures (entries) for the ten targets, sometimes using both curated and un-curated lists to generate multiple entries for a single target. The accuracy of the entries could be established by comparing them to the corresponding manually solved structure of each target, which was not available at the time the data were provided. Across the entire dataset, 71% of all entries submitted achieved an accuracy better than 1.5 Å relative to the reference NMR structure. Methods based on NOESY peak lists achieved even better results, with up to 100% of entries within the 1.5 Å threshold for some programs. However, some methods did not converge for some targets when using un-curated NOESY peak lists. Over 90% of the entries achieved an accuracy better than the more relaxed 2.5 Å threshold used in the previous CASD-NMR-2010 round. Comparisons between entries generated with un-curated versus curated peaks show only marginal improvements for the latter in those cases where both calculations converged.
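The 1.5 Å accuracy threshold refers to coordinate RMSD against the reference structure. A minimal sketch of the RMSD calculation, assuming the two structures have already been optimally superposed (real comparisons first apply a Kabsch-style superposition):

```python
import math

def rmsd(coords_a, coords_b):
    """RMSD (in the coordinates' units, e.g. angstroms) between two
    equal-length lists of (x, y, z) atom positions, assumed already
    superposed onto a common frame."""
    if len(coords_a) != len(coords_b):
        raise ValueError("coordinate lists must have equal length")
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))
```

An entry would pass the accuracy cut when `rmsd(entry, reference) < 1.5`; which atoms enter the comparison (e.g. backbone only, well-defined residues) is defined by the assessment protocol, not this sketch.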
Abstract:
In this paper, we critically examine a special class of graph matching algorithms that follow the approach of node-similarity measurement. A high-level algorithmic framework, the node-similarity graph matching (NSGM) framework, is proposed, from which many existing graph matching algorithms can be subsumed, including the eigen-decomposition method of Umeyama, the polynomial-transformation method of Almohamad, the hubs-and-authorities method of Kleinberg, and the Kronecker product successive projection methods of van Wyk, among others. In addition, improved algorithms can be developed from the NSGM framework with respect to the corresponding results in graph theory. As an observation, we point out that, in general, any algorithm subsumed by the NSGM framework fails to work well for graphs with non-trivial automorphism structure.
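One of the subsumed methods, Kleinberg's hubs-and-authorities iteration, illustrates the iterate-and-normalise pattern of node-similarity scoring that the framework generalises. A minimal sketch on an illustrative toy graph (the graph and iteration count are assumptions, not from the paper):

```python
def hits(adj, iters=50):
    """Kleinberg's hubs-and-authorities scores for a directed graph given
    as {node: [successor, ...]}; both score vectors are L2-normalised on
    each sweep."""
    nodes = list(adj)
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iters):
        # authority score of n: sum of hub scores of nodes pointing at n
        auth = {n: sum(hub[m] for m in nodes if n in adj[m]) for n in nodes}
        norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        auth = {n: v / norm for n, v in auth.items()}
        # hub score of n: sum of authority scores of nodes n points at
        hub = {n: sum(auth[m] for m in adj[n]) for n in nodes}
        norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        hub = {n: v / norm for n, v in hub.items()}
    return hub, auth
```

In the matching setting, the same kind of iteration is run on a coupled similarity matrix between two graphs rather than on a single graph, which is where the framework's unification applies.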
Abstract:
A review of polymer cure models used in microelectronics packaging applications reveals no clear consensus on the chemical rate constants for the cure reactions, or even on an effective model. The problem lies in the contrast between the actual cure process, which involves a sequence of distinct chemical reactions, and the models, which typically assume only one (or two, with some restrictions on the independence of their characteristic constants). The standard techniques for determining the model parameters are based on differential scanning calorimetry (DSC), which cannot distinguish between the reactions and hence yields results useful only under the same conditions, which completely misses the point of modeling. The obvious solution is for manufacturers to provide the modeling parameters; failing that, an alternative experimental technique, e.g. Fourier transform infrared spectroscopy (FTIR), is required to determine individual reaction parameters.
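The single-reaction model being critiqued is typically an nth-order rate law with an Arrhenius rate constant, dα/dt = A·exp(−E/RT)·(1 − α)ⁿ, where α is the degree of cure. A minimal sketch with illustrative (not fitted) parameter values:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def cure_fraction(A, E, n, T, t_end, dt=0.1):
    """Degree of cure alpha at time t_end for an isothermal hold at
    temperature T (kelvin), integrated by explicit Euler.
    A (1/s), E (J/mol) and n are the single-reaction model constants."""
    k = A * math.exp(-E / (R * T))  # Arrhenius rate constant
    alpha, t = 0.0, 0.0
    while t < t_end:
        alpha = min(1.0, alpha + dt * k * (1.0 - alpha) ** n)
        t += dt
    return alpha
```

A DSC fit sees only the total heat flow, i.e. the sum of such terms over all concurrent reactions, which is why constants fitted this way are not transferable to other cure schedules.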
Abstract:
This second edition adopts a critical and theoretical perspective on remuneration policy and practices in the UK, from the decline of collective bargaining to the rise of more individualistic systems based on employee performance. It tackles the conceptual issues missing from existing texts in the field of HRM by critically examining the latest academic literature on the topic. [Taken from publisher's product description.]
Abstract:
The Critical Friends (CFs) of the U&I programme are developing guidelines on the role of the Critical Friend and the way in which this role links with the U&I programme model, projects and outputs. The critical friends are also in the process of building a new online community of shared effective practice for current and future critical friends. The CF Benefits Realisation project aims to synthesise existing CF U&I, JISC Curriculum Design and Delivery, JISC Institutional Innovation and related programmes, activities, methodologies and approaches to produce a range of specialist guidelines and other outputs for effective CF practice, within the context of the aims and objectives of the JISC U&I programme. We aim to disseminate these, following consultation, to a wide range of interests within the JISC HE-FE communities.
Abstract:
The drug calculation skills of nurses continue to be a national concern. This continued concern has led to the introduction of mandatory drug calculation skills tests which students must pass in order to enter the nursing register. However, there is little evidence to demonstrate that nurses are poor at solving drug calculations in practice. This paper argues that nurse educationalists have inadvertently created a problem that arguably does not exist in practice, through the use of invalid written drug assessment tests, and have introduced their own pedagogical practice of solving written drug calculations. The paper draws on literature across mathematics, philosophy, psychology and nurse education to demonstrate why written drug assessments are invalid, why learning must take place predominantly in the clinical area, and why the focus on numeracy and formal mathematical skills as essential knowledge for nurses is potentially unnecessary.
Abstract:
Environmental science is often described as an interdisciplinary subject, but one firmly grounded in positivist science. Less well recognized is the idea that interdisciplinarity actually challenges fundamental conceptions of how reality is understood from an orthodox science perspective. Drawing on recent non-dualist (or post-natural) literature, this paper suggests that there is a need for greater awareness and debate concerning the underlying challenges that ideas of interdisciplinarity and holism present for environmental science. It is argued that, by aligning environmental science more strongly with non-dualistic traditions (spanning the sciences, arts and religion), fundamental issues are raised concerning how reality is understood and what constitutes valid research methodologies. The concept of intrinsic value is used as one example of the way non-dualistic theory can open up new territories for exploring reality.
Abstract:
Many women who have higher-risk pregnancies, complications or medical conditions require specialist obstetric or multidisciplinary care. Increasingly, women whose condition deteriorates and becomes critical during childbirth are being cared for by midwives in obstetric high dependency units within the labour ward, rather than by nurses in the ITU. Critical Care in Childbearing for Midwives explores all aspects of the management, support and care of childbearing women who become critically ill due to pre-existing conditions or who develop critical illness as a result of complications of childbearing. It examines predisposing factors which result in the need for critical care, addresses specialist monitoring technology and skills, and explores autonomous practice and team approaches to providing care for critically ill women in childbearing.
Abstract:
In the past 15 years in the UK, the state has acquired powers which mark a qualitative shift in its relationship to higher education. Since the introduction and implementation of the Further and Higher Education Act 1992, the Teaching and Higher Education Act 1998 and the Higher Education Act 2004, a whole raft of changes has occurred, including the following: widening participation; the development of interdisciplinary, experiential and workplace-based learning focused on a theory-practice dialogue; quality assurance; and new funding models which encompass public and private partnerships. The transformation of higher education can be placed in the context of New Labour’s overall strategies for overarching reform of public services, as set out in the Prime Minister’s Strategy Unit’s discussion paper The UK Government’s Approach to Public Service Reform (2006). An optimistic view of the changes to higher education is that they simultaneously obey democratic and economic imperatives: there is an avowed commitment, through the widening participation agenda, to social inclusion and citizenship, and to providing the changing skills base necessary for the global economy. A more cynical view is that, under critical scrutiny, these changes can be seen not only as emancipatory but also, in some senses, as mobilising regulatory and disciplinary practices. This paper reflects on what kinds of teaching and learning are promoted by the new relationship between the state and the university. It argues that, whilst governmental directives for innovations and transformations in teaching and learning allegedly empower students and put their interests at the centre, the reforms can also be seen to consist of supervisory and controlling mechanisms with regard both to our own practices as teachers and to the knowledge/learning we provide for students.
Abstract:
Deliberating on Enterprise Resource Planning (ERP) software sourcing and provision, this paper contrasts the corporate environment with the small business environment. The paper is about Enterprise Resource Planning client (ERPc) expectations and Enterprise Resource Planning vendor (ERPv) value propositions as a mutually compatible process for achieving acceptable standards of ERP software performance. It is suggested that a less-than-equitable vendor-client relationship would not contribute to the implementation of the optimum solution. Adapting selected theoretical concepts and models, the researchers analyse the ERPv-ERPc relationship. This analysis is designed to discover whether the provision of the very large ERP vendors, who market systems such as SAP, and that of the smaller ERP vendors (in this instance Eshbel Technologies Ltd, who market an ERP software solution called Priority), when framed as a value proposition (Walters, D. (2002) Operations Strategy. Hampshire, UK: Palgrave), are at all comparable or distinctive.