948 results for Set of dimensions of fractality
Abstract:
This paper develops theoretical and methodological considerations for a critical competence model (CCM). The model is defined as a functionally organized set of skills and knowledge that allows measurable results with positive consequences for strategic business objectives. The theoretical approaches of the classical competence model, the contemporary competence model and the human competencies model were reviewed to develop the proposal. Implementation of the model includes five steps: 1) conduct a job analysis, considering which dimensions or facets are subject to revision; 2) identify people with contrasting performance (the highest and lowest performers); 3) identify the critical incidents most relevant to the job position; 4) develop behavioral expectation scales (BES); and 5) validate the resulting BES with experts in the field. As a final consideration, it is concluded that competence models require accurate measurement. Excessively theoretical approaches may turn competence into a business fashion with low or minimal impact, undermining its validity, reliability and deployment in organizations.
Abstract:
The community pharmacy service medicines use review (MUR) was introduced in 2005 ‘to improve patient knowledge, concordance and use of medicines’ through a private patient–pharmacist consultation. The MUR represents a fundamental change in community pharmacy service provision. While traditionally pharmacists have been dispensers of medicines and providers of medicines advice, with patients as recipients, the MUR casts pharmacists as providers of consultation-type activities and patients as active participants. The MUR facilitates a two-way discussion about medicines use. Traditional patient–pharmacist behaviours transform into a new set of behaviours involving the booking of appointments, consultation processes and form completion, and the physical environment of the patient–pharmacist interaction moves from the traditional setting of the dispensary and medicines counter to a private consultation room. Thus, the new service challenges the traditional identities and behaviours of the patient and the pharmacist, as well as the environment in which the interaction takes place. In 2008, the UK government concluded that there is at present too much emphasis on the quantity of MURs rather than on their quality.[1] A number of plans to remedy the perceived imbalance included a suggestion to reward ‘health outcomes’ achieved, with calls for a more focussed and scientific approach to the evaluation of pharmacy services using outcomes research. Specifically, the UK government set out the principal research areas for the evaluation of pharmacy services to include ‘patient and public perceptions and satisfaction’ as well as ‘impact on care and outcomes’. A limited number of ‘patient satisfaction with pharmacy services’ type questionnaires are available, of varying quality, measuring dimensions relating to pharmacists’ technical competence, behavioural impressions and general satisfaction. For example, an often-cited paper by Larson[2] uses two factors to measure satisfaction, namely ‘friendly explanation’ and ‘managing therapy’; the factors are highly interrelated and the questions somewhat awkwardly phrased, but more importantly, we believe the questionnaire excludes some specific domains unique to the MUR. By conducting patient interviews with recent MUR recipients, we have been working to identify relevant concepts and develop a conceptual framework to inform item development for a Patient Reported Outcome Measure questionnaire bespoke to the MUR. We note with interest the recent launch of a multidisciplinary audit template by the Royal Pharmaceutical Society of Great Britain (RPSGB) in an attempt to review the effectiveness of MURs and improve their quality.[3] This template includes an MUR ‘patient survey’. We will discuss this ‘patient survey’ in light of our work and existing patient satisfaction with pharmacy questionnaires, outlining a new conceptual framework as a basis for measuring patient satisfaction with the MUR. Ethical approval for the study was obtained from the NHS Surrey Research Ethics Committee on 2 June 2008.
References
1. Department of Health (2008). Pharmacy in England: Building on Strengths – Delivering the Future. London: HMSO. www.official-documents.gov.uk/document/cm73/7341/7341.pdf (accessed 29 September 2009).
2. Larson LN et al. Patient satisfaction with pharmaceutical care: update of a validated instrument. J Am Pharm Assoc 2002; 42: 44–50.
3. Royal Pharmaceutical Society of Great Britain (2009). Pharmacy Medicines Use Review – Patient Audit. London: RPSGB. http://qi4pd.org.uk/index.php/Medicines-Use-Review-Patient-Audit.html (accessed 29 September 2009).
Abstract:
The paper reports a study of children's attitudes to school based on a questionnaire survey of 845 pupils in their first year of secondary school in England, together with interviews with a sample of the children. A clearly structured set of attitudes emerged from a factor analysis which showed a distinction between instrumental and affective aspects of attitudes but also dimensions within these, including a sense of teacher commitment and school as a difficult environment. Virtually all children had a strong sense of the importance of doing well at school. However, a substantial minority were not sure that they would stay on after 16. There were few differences between boys and girls or between children from different socio-economic backgrounds but children planning to leave at 16 enjoyed school less and were less sure that it had anything to offer them. There was an almost universal commitment to the value of education but, for a minority, an ambivalence about the experience and relevance of schooling for them.
Abstract:
Thirty-one new sodium heterosulfamates, RNHSO3Na, where the R portion contains mainly thiazole, benzothiazole, thiadiazole and pyridine ring structures, have been synthesized and their taste portfolios have been assessed. A database of 132 heterosulfamates (both open-chain and cyclic) has been formed by combining these new compounds with an existing set of 101 heterosulfamates which were previously synthesized and for which taste data are available. Simple descriptors have been obtained using (i) measurements with Corey-Pauling-Koltun (CPK) space-filling models, giving x, y and z dimensions and a volume V_CPK, (ii) calculated first-order molecular connectivities (¹χᵛ) and (iii) calculated parameters from the Spartan program, giving the HOMO and LUMO energies, the solvation energy E_solv and the volume V_SPARTAN. The techniques of linear (LDA) and quadratic (QDA) discriminant analysis and Tree analysis have then been employed to develop structure-taste relationships (SARs) that classify the sweet (S) and non-sweet (N) compounds into separate categories. In the LDA analysis 70% of the compounds were correctly classified (this compares with 65% when the smaller data set of 101 compounds was used) and in the QDA analysis 68% were correctly classified (compared to 80% previously). The Tree analysis correctly classified 81% (compared to 86% previously). An alternative Tree analysis derived using the Cerius2 program and a set of physicochemical descriptors correctly classified only 54% of the compounds.
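A minimal sketch of this sort of LDA/QDA/tree classification, assuming the descriptors have already been tabulated per compound; the file name, column names and cross-validation choice are illustrative placeholders, not the authors' data or protocol:

```python
# Sketch: classify sweet (S) vs non-sweet (N) heterosulfamates from simple descriptors
# using LDA, QDA and a decision tree. File/column names are illustrative placeholders.
import pandas as pd
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per compound with CPK dimensions/volume, 1-chi-v,
# HOMO/LUMO energies, solvation energy, Spartan volume, and a taste label S/N.
data = pd.read_csv("heterosulfamates.csv")
descriptors = ["x", "y", "z", "V_CPK", "chi1v", "HOMO", "LUMO", "E_solv", "V_SPARTAN"]
X = data[descriptors].values
y = (data["taste"] == "S").astype(int)  # 1 = sweet, 0 = non-sweet

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "Tree": DecisionTreeClassifier(max_depth=4, random_state=0),
}
for name, model in models.items():
    # Percentage of compounds correctly classified, estimated by cross-validation.
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {100 * acc:.0f}% correctly classified")
```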
Abstract:
Hybrid vigour may help overcome the negative effects of climate change in rice. A popular rice hybrid (IR75217H), a heat-tolerant check (N22), and a mega-variety (IR64) were tested for tolerance of seed-set and grain quality to high-temperature stress at anthesis at ambient and elevated [CO2]. Under an ambient air temperature of 29 °C (tissue temperature 28.3 °C), elevated [CO2] increased vegetative and reproductive growth, including seed yield in all three genotypes. Seed-set was reduced by high temperature in all three genotypes, with the hybrid and IR64 equally affected and twice as sensitive as the tolerant cultivar N22. No interaction occurred between temperature and [CO2] for seed-set. The hybrid had significantly more anthesed spikelets at all temperatures than IR64 and at 29 °C this resulted in a large yield advantage. At 35 °C (tissue temperature 32.9 °C) the hybrid had a higher seed yield than IR64 due to the higher spikelet number, but at 38 °C (tissue temperature 34–35 °C) there was no yield advantage. Grain gel consistency in the hybrid and IR64 was reduced by high temperatures only at elevated [CO2], while the percentage of broken grains increased from 10% at 29 °C to 35% at 38 °C in the hybrid. It is concluded that seed-set of hybrids is susceptible to short episodes of high temperature during anthesis, but that at intermediate tissue temperatures of 32.9 °C higher spikelet number (yield potential) of the hybrid can compensate to some extent. If the heat tolerance from N22 or other tolerant donors could be transferred into hybrids, yield could be maintained under the higher temperatures predicted with climate change.
Abstract:
Neural field models of firing rate activity typically take the form of integral equations with space-dependent axonal delays. Under natural assumptions on the synaptic connectivity we show how one can derive an equivalent partial differential equation (PDE) model that properly treats the axonal delay terms of the integral formulation. Our analysis avoids the so-called long-wavelength approximation that has previously been used to formulate PDE models for neural activity in two spatial dimensions. Direct numerical simulations of this PDE model show instabilities of the homogeneous steady state that are in full agreement with a Turing instability analysis of the original integral model. We discuss the benefits of such a local model and its usefulness in modeling electrocortical activity. In particular, we are able to treat “patchy” connections, whereby a homogeneous and isotropic system is modulated in a spatially periodic fashion. In this case the emergence of a “lattice-directed” traveling wave predicted by a linear instability analysis is confirmed by the numerical simulation of an appropriate set of coupled PDEs.
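For context, a generic firing-rate neural field with space-dependent axonal delay can be written in integral form as follows (standard notation for this class of models; the paper's particular synaptic kernel and temporal dynamics may differ):

```latex
\frac{1}{\alpha}\,\frac{\partial u(\mathbf{r},t)}{\partial t}
  \;=\; -\,u(\mathbf{r},t)
  \;+\; \int_{\mathbb{R}^{2}} w\!\left(\lvert \mathbf{r}-\mathbf{r}'\rvert\right)\,
        f\!\left( u\!\left(\mathbf{r}',\, t - \tfrac{\lvert \mathbf{r}-\mathbf{r}'\rvert}{v}\right) \right)
        \mathrm{d}\mathbf{r}'
```

Here w is the synaptic connectivity, f a firing-rate function and v the axonal conduction velocity; the space-dependent delay |r − r′|/v is the term that the equivalent PDE formulation must treat properly without invoking a long-wavelength expansion.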
Abstract:
This Themed Section aims to increase understanding of how the idea of climate change, and the policies and actions that spring from it, travel beyond their origins in the natural sciences to meet different political arenas in the developing world. It takes a discursive approach whereby climate change is not just a set of physical processes but also a series of messages, narratives and policy prescriptions. The articles are mostly case study-based and focus on sub-Saharan Africa and Small Island Developing States (SIDS). They are organised around three interlinked themes. The first theme concerns the processes of rapid technicalisation and professionalisation of the climate change ‘industry’, which have substantially narrowed the boundaries of what can be viewed as a legitimate social response to the problem of global warming. The second theme deals with the ideological effect of the climate change industry, namely ‘depoliticisation’: in this case, the deflection of attention away from the underlying political conditions of vulnerability and exploitation towards the nature of the physical hazard itself. The third theme concerns the institutional effect of an insufficiently socialised idea of climate change, namely the maintenance of existing relations of power or their reconfiguration in favour of the already powerful. Overall, the articles suggest that greater scrutiny of the discursive and political dimensions of mitigation and adaptation activities is required. In particular, greater attention should be directed towards the policy consequences that governments and donors construct as a result of their framing and rendition of climate change issues.
Abstract:
There has been a recent rejuvenation of interest in studies of motivation-cognition interactions arising from many different areas of psychology and neuroscience. The current issue of Cognitive, Affective, and Behavioral Neuroscience provides a sampling of some of the latest research from a number of these different areas. In this introductory paper, we provide an overview of the current state of the field, in terms of key research developments and candidate neural mechanisms receiving focused investigation as potential sources of motivation-cognition interaction. However, our primary goal is conceptual: to highlight the distinct perspectives taken by different research areas in terms of how motivation is defined, the relevant dimensions and dissociations that are emphasized, and the theoretical questions being targeted. Together, these distinctions present both challenges and opportunities for efforts aiming towards a more unified and cross-disciplinary approach. We identify a set of pressing research questions calling out for this sort of cross-disciplinary approach, with the explicit goal of encouraging integrative and collaborative investigations directed towards them.
Abstract:
The overall global-scale consequences of climate change are dependent on the distribution of impacts across regions, and there are multiple dimensions to these impacts. This paper presents a global assessment of the potential impacts of climate change across several sectors, using a harmonised set of impacts models forced by the same climate and socio-economic scenarios. Indicators of impact cover the water resources, river and coastal flooding, agriculture, natural environment and built environment sectors. Impacts are assessed under four SRES socio-economic and emissions scenarios, and the effects of uncertainty in the projected pattern of climate change are incorporated by constructing climate scenarios from 21 global climate models. There is considerable uncertainty in projected regional impacts across the climate model scenarios, and coherent assessments of impacts across sectors and regions must therefore be based on each model pattern separately; using ensemble means, for example, reduces variability between sectors and indicators. An example narrative assessment is presented in the paper. Under this narrative, approximately 1 billion people would be exposed to increased water resources stress, around 450 million people would be exposed to increased river flooding, and 1.3 million extra people would be flooded in coastal floods each year. Crop productivity would fall in most regions, and residential energy demands would be reduced in most regions because reduced heating demands would offset higher cooling demands. Most of the global impacts on water stress and flooding would be in Asia, but the proportional impacts in the Middle East and North Africa region would be larger. By 2050 there are emerging differences in impact between different emissions and socio-economic scenarios even though the changes in temperature and sea level are similar, and these differences are greater by 2080. However, for all the indicators, the range in projected impacts between different climate models is considerably greater than the range between emissions and socio-economic scenarios.
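One way to see the flavour of the point about model patterns versus ensemble means is a toy calculation (all numbers are synthetic placeholders, not results from the study): a nonlinear impact indicator evaluated on each climate-model pattern separately retains the spread across models, whereas the same indicator evaluated on the ensemble-mean pattern collapses that spread into a single value:

```python
# Toy illustration: why impact indicators are computed per climate-model pattern
# rather than from the ensemble-mean pattern. Values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_models, n_regions = 21, 100

# Hypothetical regional warming patterns from 21 climate models (degrees C).
patterns = rng.normal(loc=2.0, scale=0.8, size=(n_models, n_regions))

def regions_exposed(warming, threshold=2.5):
    """Nonlinear indicator: count regions where warming exceeds a threshold."""
    return (warming > threshold).sum(axis=-1)

per_model = regions_exposed(patterns)               # one estimate per model pattern
from_mean = regions_exposed(patterns.mean(axis=0))  # single estimate from ensemble mean

print("range across model patterns:", per_model.min(), "to", per_model.max())
print("estimate from ensemble-mean pattern:", from_mean)
```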
Abstract:
Extending our previous work ‘Fields on the Poincaré group and quantum description of orientable objects’ (Gitman and Shelepin 2009 Eur. Phys. J. C 61 111–39), we consider here a classification of orientable relativistic quantum objects in 3 + 1 dimensions. In such a classification, one uses a maximal set of ten commuting operators (generators of left and right transformations) in the space of functions on the Poincaré group. In addition to the usual six quantum numbers related to external symmetries (given by left generators), there appear additional quantum numbers related to internal symmetries (given by right generators). Spectra of internal and external symmetry operators are interrelated, which, however, does not contradict the Coleman-Mandula no-go theorem. We believe that the proposed approach can be useful for the description of elementary spinning particles considered as orientable objects. In particular, it gives a group-theoretical interpretation of some facts of the existing phenomenological classification of spinning particles.
Abstract:
We propose an approach to the quantum-mechanical description of relativistic orientable objects. It generalizes Wigner's ideas concerning the treatment of nonrelativistic orientable objects (in particular, a nonrelativistic rotator) with the help of two reference frames (space-fixed and body-fixed). A technical realization of this generalization (for instance, in 3+1 dimensions) amounts to introducing wave functions that depend on elements of the Poincaré group G. A complete set of transformations that test the symmetries of an orientable object and of the embedding space belongs to the group I = G×G. All such transformations can be studied by considering a generalized regular representation of G in the space of scalar functions on the group, f(x,z), that depend on the Minkowski space points x ∈ G/Spin(3,1) as well as on the orientation variables given by the elements z of a matrix Z ∈ Spin(3,1). In particular, the field f(x,z) is a generating function of the usual spin-tensor multi-component fields. In the theory under consideration, there are four different types of spinors, and an orientable object is characterized by ten quantum numbers. We study the corresponding relativistic wave equations and their symmetry properties.
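For orientation, the left and right actions that underlie such a G×G construction can be sketched in standard group-theoretic notation (ours, not necessarily the authors' conventions): for functions f on a group G,

```latex
\big(T_L(g)\,f\big)(h) \;=\; f\!\left(g^{-1} h\right), \qquad
\big(T_R(g)\,f\big)(h) \;=\; f\!\left(h\, g\right), \qquad
\big[\,T_L(g_1),\, T_R(g_2)\,\big] \;=\; 0 .
```

Because the left action (external symmetries) commutes with the right action (internal symmetries), generators drawn from both families can be diagonalized simultaneously, consistent with the maximal set of ten commuting operators and the ten quantum numbers mentioned above.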
Abstract:
The recent advances in CMOS technology have allowed for the fabrication of transistors with submicron dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. Such an increase in design complexity has created a need for more efficient verification tools that can incorporate more appropriate physical and computational models. Timing verification aims at determining whether the timing constraints imposed on the design can be satisfied or not. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimulus-dependent. Hence, in order to ensure that the critical situation is taken into account, one must exercise all possible input patterns. Obviously, this is not feasible given the high complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only the circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, such a method may result in overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology, but also the temporal and functional relations between circuit elements. Functional timing analysis tools may differ in three aspects: the set of sensitization conditions necessary to declare a path sensitizable (i.e., the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable or not. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been studied exhaustively over the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates. This is the basic concern of this thesis. In addition, and as a necessary step to set the scene, a detailed and systematic study of functional timing analysis is also presented.
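As an illustration of the topological step described above, a minimal sketch of critical-delay estimation on a combinational block modelled as a directed acyclic graph; the gate names and delay values are invented for the example, and false-path elimination (the whole point of functional timing analysis) is deliberately omitted:

```python
# Sketch: topological timing analysis as a longest-path computation on a DAG.
# Node names and delays are illustrative only; false paths are NOT filtered here,
# which is exactly why purely topological estimates can be pessimistic.
from collections import defaultdict
from graphlib import TopologicalSorter

# Combinational block: edge u -> v means gate/pin u feeds v; delay[v] is v's gate delay.
edges = {"a": ["g1"], "b": ["g1", "g2"], "c": ["g2"], "g1": ["g3"], "g2": ["g3"], "g3": ["out"]}
delay = {"a": 0.0, "b": 0.0, "c": 0.0, "g1": 1.2, "g2": 0.8, "g3": 1.5, "out": 0.0}

# Build predecessor lists and visit nodes in topological order.
preds = defaultdict(list)
for u, vs in edges.items():
    for v in vs:
        preds[v].append(u)

arrival = {}
for node in TopologicalSorter(preds).static_order():
    # Arrival time = own delay + latest arrival among fan-in signals (0 for primary inputs).
    arrival[node] = delay[node] + max((arrival[p] for p in preds[node]), default=0.0)

print("critical (topological) delay:", arrival["out"])  # 1.2 + 1.5 = 2.7
```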
Abstract:
The value-of-life methodology has recently been applied to a wide range of contexts as a means to evaluate welfare gains attributable to mortality reductions and health improvements. Yet, it suffers from an important methodological drawback: it does not incorporate into the analysis child mortality, individuals’ decisions regarding fertility, and their altruism towards offspring. Two interrelated dimensions of fertility choice are potentially essential in evaluating life expectancy and health-related gains. First, child mortality rates can be very important in determining welfare in a context where individuals choose the number of children they have. Second, if altruism motivates fertility, life expectancy gains at any point in life have a twofold effect: they directly increase utility via increased survival probabilities, and they increase utility via the increased welfare of the offspring. We develop a manageable way to deal with value-of-life valuations when fertility choices are endogenous and individuals are altruistic towards their offspring. We use the methodology developed in the paper to value the reductions in mortality rates experienced by the US between 1965 and 1995. The calculations show that, with a very conservative set of parameters, altruism and fertility can easily double the value of mortality reductions for a young adult, when compared to results obtained using the traditional value-of-life methodology.
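A rough sketch of the object involved, in generic notation (ours, not the paper's): in the standard framework the value attached to mortality reductions comes from expected lifetime utility,

```latex
V_a \;=\; \int_a^{\infty} e^{-\rho (t-a)}\, \frac{S(t)}{S(a)}\, u\!\big(c(t)\big)\, \mathrm{d}t ,
```

where S is the survival function and a mortality reduction is valued by the consumption change that compensates for the induced change in S. With altruistic, endogenous fertility the objective gains an offspring term, schematically

```latex
\tilde V_a \;=\; \int_a^{\infty} e^{-\rho (t-a)}\, \frac{S(t)}{S(a)}
  \Big[\, u\!\big(c(t)\big) \;+\; \alpha\, n(t)\, \tilde V_0 \,\Big]\, \mathrm{d}t ,
```

so a survival gain raises welfare twice: directly, through higher survival probabilities, and indirectly, through the discounted welfare of the children weighted by the fertility rate n(t) and the altruism parameter α; this is the twofold effect the abstract describes.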
Abstract:
Tests on printed circuit boards and integrated circuits are widely used in industry, resulting in reduced design time and cost of a project. The functional and connectivity tests in this type of circuit soon became a concern for manufacturers, leading to research for solutions that would provide a reliable, quick, cheap and universal answer. Initially, test schemes were based on a set of needles connected to the inputs and outputs of the integrated circuits on the board (bed of nails), to which signals were applied in order to verify whether the circuit met the specifications and could be assembled in the production line. With the development of projects, circuit miniaturization, improvement of the production processes, improvement of the materials used, as well as the increase in the number of circuits, it became necessary to search for another solution. Thus Boundary-Scan Testing was developed, which operates at the boundary of integrated circuits and allows testing the connectivity of the input and output ports of a circuit. The Boundary-Scan Testing method was standardized in 1990 by the IEEE, becoming known as the IEEE 1149.1 Standard. Since then a large number of manufacturers have adopted this standard in their products. The main objective of this master's thesis is the design of Boundary-Scan Testing for an image sensor in CMOS technology, analyzing the standard requirements and the process used in the prototype production, developing the design and layout of the Boundary-Scan logic, and analyzing the results obtained after production. Chapter 1 briefly presents the evolution of testing procedures used in industry, developments and applications of image sensors, and the motivation for using the Boundary-Scan Testing architecture. Chapter 2 explores the fundamentals of Boundary-Scan Testing and image sensors, starting with the Boundary-Scan architecture defined in the Standard, where the functional blocks are analyzed; this understanding is necessary to implement the design on an image sensor. It also explains the architecture of image sensors currently used, focusing on sensors with a large number of inputs and outputs. Chapter 3 describes the design of the implemented Boundary-Scan, starting with an analysis of the design and functions of the prototype, the software used, and the designs and simulations of the functional blocks of the implemented Boundary-Scan. Chapter 4 presents the layout process based on the design developed in Chapter 3, describing the software used for this purpose, the planning of the layout location (floorplan) and its dimensions, the layout of the individual blocks, the checks in terms of layout rules, the comparison with the final design and finally the simulation. Chapter 5 describes how the functional tests were performed to verify the design's compliance with the specifications of the IEEE 1149.1 Standard; these tests focused on the application of signals to the input and output ports of the produced prototype. Chapter 6 presents the conclusions drawn throughout the execution of the work.
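A much-simplified software model of the core idea (shifting a pattern through the boundary-scan cells between TDI and TDO, then updating the pins), useful only as an illustration of the Standard's serial access; it omits the TAP state machine, the instruction register and the actual cell hardware:

```python
# Toy model of a boundary-scan register: bits shifted in on TDI appear on TDO
# after one pass through the chain, and UPDATE transfers the shifted pattern to
# the pins (as EXTEST would). Purely illustrative; not an IEEE 1149.1 implementation.
class BoundaryScanChain:
    def __init__(self, n_cells: int):
        self.shift_reg = [0] * n_cells   # capture/shift stage of each cell
        self.update_reg = [0] * n_cells  # update (pin-facing) stage of each cell

    def shift(self, tdi_bit: int) -> int:
        """Clock one bit in on TDI; the bit falling off the end is TDO."""
        tdo_bit = self.shift_reg[-1]
        self.shift_reg = [tdi_bit] + self.shift_reg[:-1]
        return tdo_bit

    def update(self) -> list[int]:
        """Latch the shifted pattern onto the (virtual) pins."""
        self.update_reg = list(self.shift_reg)
        return self.update_reg


# Shift a 4-bit test pattern into a 4-cell chain, then drive it onto the pins.
chain = BoundaryScanChain(4)
pattern = [1, 0, 1, 1]
captured_tdo = [chain.shift(b) for b in pattern]  # previous contents come out on TDO
pins = chain.update()
print("TDO while shifting:", captured_tdo)  # [0, 0, 0, 0] from the reset chain
print("pins after UPDATE: ", pins)          # [1, 1, 0, 1] (last bit shifted sits in cell 0)
```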
Abstract:
The goal of this work is to assess the efficacy of texture measures for estimating levels of crowd density in images. This estimation is crucial for the problem of crowd monitoring and control. The assessment is carried out on a set of nearly 300 real images captured from Liverpool Street Train Station, London, UK, using texture measures extracted from the images through the following four different methods: gray level dependence matrices, straight line segments, Fourier analysis, and fractal dimensions. The estimations of crowd densities are given in terms of the classification of the input images into five classes of density (very low, low, moderate, high and very high). Three types of classifiers are used: neural (implemented according to the Kohonen model), Bayesian, and an approach based on fitting functions. The results obtained by these three classifiers, using the four texture measures, allowed the conclusion that, for the problem of crowd density estimation, texture analysis is very effective.
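A minimal sketch of one of the four feature paths (gray level dependence, i.e. co-occurrence, matrices) feeding a simple Bayesian classifier; the directory layout and the use of scikit-image with a Gaussian naive Bayes model are assumptions made for illustration, not the tools or data of the original study:

```python
# Sketch: GLCM texture features per image -> 5-class crowd density classifier.
# Dataset paths and the scikit-image + Gaussian naive Bayes choices are stand-ins
# for the pipeline described in the abstract.
import numpy as np
from pathlib import Path
from skimage.io import imread
from skimage.color import rgb2gray
from skimage.feature import graycomatrix, graycoprops
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

CLASSES = ["very_low", "low", "moderate", "high", "very_high"]

def glcm_features(path: Path) -> np.ndarray:
    """Co-occurrence texture descriptors for one image."""
    img = imread(path)
    if img.ndim == 3:                                   # colour -> 8-bit grayscale
        img = (rgb2gray(img) * 255).astype(np.uint8)
    glcm = graycomatrix(img.astype(np.uint8), distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Assumed layout: data/<class_name>/*.png, one folder per density class.
X, y = [], []
for label, name in enumerate(CLASSES):
    for path in Path("data", name).glob("*.png"):
        X.append(glcm_features(path))
        y.append(label)

acc = cross_val_score(GaussianNB(), np.array(X), np.array(y), cv=5).mean()
print(f"mean cross-validated accuracy: {acc:.2f}")
```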