906 results for Quantities and measurements
Abstract:
We review the role of strong electronic correlations in quasi-two-dimensional organic charge transfer salts such as (BEDT-TTF)₂X, (BETS)₂Y, and β′-[Pd(dmit)₂]₂Z. We begin by defining minimal models for these materials. It is necessary to identify two classes of material: the first class is strongly dimerized and is described by a half-filled Hubbard model; the second class is not strongly dimerized and is described by a quarter-filled extended Hubbard model. We argue that these models capture the essential physics of these materials. We explore the phase diagram of the half-filled quasi-two-dimensional organic charge transfer salts, focusing on the metallic and superconducting phases. We review work showing that the metallic phase, which has both Fermi liquid and 'bad metal' regimes, is described both quantitatively and qualitatively by dynamical mean field theory (DMFT). The phenomenology of the superconducting state is still a matter of contention. We critically review the experimental situation, focusing on the key experimental results that may distinguish between rival theories of superconductivity, particularly probes of the pairing symmetry and measurements of the superfluid stiffness. We then discuss some strongly correlated theories of superconductivity, in particular the resonating valence bond (RVB) theory of superconductivity. We conclude by discussing some of the major challenges currently facing the field. These include parameterizing minimal models, the evidence for a pseudogap from nuclear magnetic resonance (NMR) experiments, superconductors with low critical temperatures and extremely small superfluid stiffnesses, the possible spin-liquid states in κ-(ET)₂Cu₂(CN)₃ and β′-[Pd(dmit)₂]₂Z, and the need for high quality large single crystals.
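As a schematic reminder (standard notation, not reproduced from the review itself), the two minimal models referred to above are the half-filled Hubbard model and its quarter-filled extension with nearest-neighbour repulsion:

```latex
% Half-filled Hubbard model: hopping t between dimer sites,
% on-site Coulomb repulsion U
H = -t \sum_{\langle ij\rangle,\sigma}
      \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \text{h.c.} \right)
    + U \sum_{i} n_{i\uparrow} n_{i\downarrow}

% Quarter-filled extended Hubbard model: the same terms plus a
% nearest-neighbour repulsion V
H' = H + V \sum_{\langle ij\rangle} n_{i} n_{j}
```

Here t is the intersite hopping, U the on-site repulsion, and V the nearest-neighbour repulsion; which model applies depends on the degree of dimerization, as the abstract notes.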
Abstract:
Simple design formulas for designing ultra wideband (UWB) antennas in the form of complementary planar monopoles are described and their validity is tested using full electromagnetic wave simulations and measurements. Assuming a dielectric substrate with relative permittivity of 10.2, the designed antennas feature a small size of 13 mm × 26 mm. They exhibit a 10 dB return loss bandwidth from 3 to more than 15 GHz, accompanied by near-omnidirectional characteristics and good radiation efficiency throughout this band.
Abstract:
This dissertation presents the main aspects of Game Theory, showing its application as an analytical instrument in People Management with respect to the wage variable. It treats the organization and the worker as general concepts, without identifying the organization's sector, line of business, legal classification by revenue, total number of employees, or market share. Likewise, the concept of worker receives no identification regarding sector of activity, role, wage, or professional training. The organization is any structure that produces goods and services for society, and the worker is anyone who employs their labour in the production of goods and services. The objectives established for this study are: to identify the possibilities of applying Game Theory to People Management, considering the wage variable as an element of conflict between the organization and the worker; and to show whether the extensive form of representation is appropriate for analysing the scenario of confrontation in the decision to hire or not hire the worker, or to pay a higher or lower wage, and the existence of a Nash Equilibrium. The research is qualitative, supported by bibliographic and documentary sources. Qualitative methods help interpret everyday phenomena and may draw on symbolic data situated in a particular context. Documentary research is an important contribution to the study of the proposed topic, since qualitative research is not a rigidly structured proposal, which allows the researcher to use imagination and creativity to achieve the objective.
The results indicate that Game Theory can be applied to People Management, considering the confrontation between the players (the worker and the organization) over wages, as shown in chapter 4 in the payoff-matrix representations of a strategic game and in figures 9, 10, 11 and 16. The extensive-form representation, another objective, indicates the payoffs of two central decisions, X = flexibilization with workers waiving their rights and Y = flexibilization/adaptation/negotiation, as shown in figure 16. By analysing this figure, the people manager can see the strategies available to the organization and the worker for decision-making, while also assessing the current situation and running simulations in search of new proposals. Finally, the Nash Equilibrium as applied to People Management is discussed in section 4.1.3, where it is shown that both the worker and the organization can reach a decision favourable to both while maintaining their initial objectives. In figure 17, this equilibrium is reached after the worker accepts the organization's offer in sequence O2, leaving the worker on the sequence branch T2 with a value of 20 coins. The potential of Game Theory in People Management stems from the fact that anyone working in an organization shares in the good or bad outcomes produced by others' choices, by individual choices, and by collectively constructed choices. When the worker decides to produce less, the company loses the profit generated by the slower pace of work. To change this situation, the company decides to raise the wage; the worker, in turn, performs the task faster and in greater quantity, and the company can recover its profit. These games involve performance demands, pressure to meet targets, and conflicts with clients and leadership.
Game Theory can therefore be applied as an instrument for the people manager to assess the situation at hand and make a decision that resolves the confrontation.
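A pure-strategy Nash equilibrium of a two-player strategic game can be found by exhaustive best-response checking. The sketch below is illustrative only: the strategy labels echo the abstract's O2/T2 sequences, but all payoff values are invented for the example, not taken from the dissertation.

```python
from itertools import product

def pure_nash(payoffs):
    """Return pure-strategy Nash equilibria of a two-player game.

    payoffs[(i, j)] = (worker_payoff, org_payoff) for worker strategy i
    and organization strategy j.
    """
    rows = sorted({i for i, _ in payoffs})
    cols = sorted({j for _, j in payoffs})
    equilibria = []
    for i, j in product(rows, cols):
        w, o = payoffs[(i, j)]
        # The worker cannot gain by deviating to another row...
        best_row = all(payoffs[(k, j)][0] <= w for k in rows)
        # ...and the organization cannot gain by deviating to another column.
        best_col = all(payoffs[(i, k)][1] <= o for k in cols)
        if best_row and best_col:
            equilibria.append((i, j))
    return equilibria

# Hypothetical payoff matrix (values invented for illustration):
# the worker rejects (T1) or accepts (T2); the organization offers a
# low (O1) or high (O2) wage.
game = {
    ("T1", "O1"): (5, 5),
    ("T1", "O2"): (4, 3),
    ("T2", "O1"): (8, 6),
    ("T2", "O2"): (20, 10),
}
print(pure_nash(game))  # → [('T2', 'O2')]
```

With these illustrative numbers the unique equilibrium is the worker accepting the high-wage offer, mirroring the O2/T2 outcome the abstract describes.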
Abstract:
The aim of this study was to evaluate possible changes in the horizontal, vertical, symmetry and smile-arc characteristics of patients with maxillary atresia undergoing rapid maxillary expansion. The sample consisted of 81 extraoral photographs of the maximum smile of 27 patients with a mean age of 10 years and 3 months. Photographs of the maximum smile were taken at three stages: initial (before installation of the expander appliance); 3 months after fixation of the expansion screw; and 6 months after fixation of the expansion screw. The CEFX 2001 CDT software was used for calibration and analysis of the photographs. The photometric landmarks and the measurements to be analysed were chosen after a review of the smile literature. Analysis of variance (ANOVA) with a 5% significance level was used to evaluate changes in the smile across the stages. Rapid maxillary expansion promoted a statistically significant increase in the transverse dimension of the smile; an increase in the exposure of the upper central and lateral incisors; maintenance of symmetry between the right and left sides; and maintenance of the lack of parallelism between the curvature of the upper incisor edges and the curvature of the lower lip (smile arc).
Abstract:
This thesis provides an interoperable language for quantifying uncertainty using probability theory. A general introduction to interoperability and uncertainty is given, with particular emphasis on the geospatial domain. Existing interoperable standards used within the geospatial sciences are reviewed, including Geography Markup Language (GML), Observations and Measurements (O&M) and the Web Processing Service (WPS) specifications. The importance of uncertainty in geospatial data is identified and probability theory is examined as a mechanism for quantifying these uncertainties. The Uncertainty Markup Language (UncertML) is presented as a solution to the lack of an interoperable standard for quantifying uncertainty. UncertML is capable of describing uncertainty using statistics, probability distributions or a series of realisations. The capabilities of UncertML are demonstrated through a series of XML examples. This thesis then provides a series of example use cases where UncertML is integrated with existing standards in a variety of applications. The Sensor Observation Service - a service for querying and retrieving sensor-observed data - is extended to provide a standardised method for quantifying the inherent uncertainties in sensor observations. The INTAMAP project demonstrates how UncertML can be used to aid uncertainty propagation using a WPS by allowing UncertML as input and output data. The flexibility of UncertML is demonstrated with an extension to the GML geometry schemas to allow positional uncertainty to be quantified. Further applications and developments of UncertML are discussed.
Abstract:
This paper develops a productivity index applicable when producers are cost minimisers and input prices are known. The index is inspired by the Malmquist index as extended to productivity measurement. The index developed here is defined in terms of input cost rather than input quantity distance functions. Hence, productivity change is decomposed into overall efficiency and cost technical change. Furthermore, overall efficiency change is decomposed into technical and allocative efficiency change and cost technical change into a part capturing shifts of input quantities and shifts of relative input prices. These decompositions provide a clearer picture of the root sources of productivity change. They are illustrated here in a sample of hospitals; results are computed using non-parametric mathematical programming. © 2003 Elsevier B.V. All rights reserved.
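Schematically, and in hypothetical notation rather than the paper's own, the nested decomposition described above has the form:

```latex
% Cost-based productivity index: overall efficiency change times
% cost technical change
M_C = \mathit{OEC} \times \mathit{CTC}

% Overall efficiency change splits into technical and allocative parts
\mathit{OEC} = \mathit{TEC} \times \mathit{AEC}

% Cost technical change splits into a part capturing shifts of input
% quantities and a part capturing shifts of relative input prices
\mathit{CTC} = \mathit{CTC}_{\text{quantities}} \times \mathit{CTC}_{\text{prices}}
```

The symbol names are placeholders for the components the abstract lists; the paper defines each factor in terms of input cost distance functions.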
Abstract:
Background: Re-use of unused medicines returned from patients is currently considered unethical in the UK and these are usually destroyed by incineration. Previous studies suggest that many of these medicines may be in a condition suitable for re-use. Methods: All medicines returned over two months to participating community pharmacies and GP surgeries in Eastern Birmingham PCT were assessed for type, quantity and value. A registered pharmacist assessed packs against set criteria to determine the suitability for possible re-use. Results: Nine hundred and thirty-four return events were made from 910 patients, comprising 3765 items worth £33 608. Cardiovascular drugs (1003, 27%) and those acting on the CNS (884, 24%) were most prevalent. Returned packs had a median of 17 months remaining before expiry and one-quarter of packs (1248 out of 4291) were suitable for possible re-use. One-third of those suitable for re-use (476 out of 1248) contained drugs in the latest WHO Essential Drugs List. Conclusion: Unused medicines are returned in substantial quantities and have considerable financial value, with many in a condition suitable for re-use. We consider it appropriate to reopen the debate on the potential for re-using these medicines in developing countries where medicines are not widely available and also within the UK. © The Author 2007, Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved.
Abstract:
INTAMAP is a web processing service for the automatic interpolation of measured point data. Requirements were (i) using open standards for spatial data such as developed in the context of the open geospatial consortium (OGC), (ii) using a suitable environment for statistical modelling and computation, and (iii) producing an open source solution. The system couples the 52-North web processing service, accepting data in the form of an observations and measurements (O&M) document with a computing back-end realized in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a new markup language to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications and the algorithms have been designed to cope with anisotropies and extreme values. In the light of the INTAMAP experience, we discuss the lessons learnt.
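INTAMAP's actual back-end performs statistical interpolation in R; as a minimal stand-in for the idea of automatically interpolating measured point data, the sketch below uses simple inverse-distance weighting in Python (function name and data points are hypothetical, and this is not the INTAMAP algorithm):

```python
import math

def idw(points, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from (x, y, value) points."""
    num, den = 0.0, 0.0
    for x, y, v in points:
        d = math.hypot(x - query[0], y - query[1])
        if d == 0.0:
            return v  # exact hit on an observation: return the observed value
        w = d ** -power
        num += w * v
        den += w
    return num / den

# Three hypothetical observations and an estimate at their midpoint,
# which is equidistant from all three, so the result is their mean.
obs = [(0.0, 0.0, 1.0), (1.0, 0.0, 3.0), (0.0, 1.0, 5.0)]
print(idw(obs, (0.5, 0.5)))  # → 3.0
```

Unlike the geostatistical methods INTAMAP uses, inverse-distance weighting provides no error distribution, which is exactly the gap UncertML addresses.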
Abstract:
Antisense oligodeoxynucleotides can selectively inhibit gene expression provided they are delivered to their target site successfully for a sufficient duration. Biodegradable microspheres have previously been developed for the potential systemic delivery of antisense oligodeoxynucleotides and offer an excellent strategy for central administration, providing a sustained-release delivery system. Biodegradable microspheres were formulated to entrap antisense oligodeoxynucleotides for stereotaxic implantation into site-specific regions of the rat brain. Triphasic release profiles of antisense oligodeoxynucleotides from biodegradable microspheres over 56 days were observed with high-molecular-weight polymers. Antisense oligodeoxynucleotides loaded into microspheres (1-10 μm) showed a five-fold increase in cellular association with glial and neuronal cells compared with the naked molecule, which was partially due to greater cellular accumulation, as observed in a slower efflux profile. In vivo distribution studies of antisense oligodeoxynucleotides demonstrated that the microspheres provided sustained release over more than 2 days, compared with 12 hours for the naked molecule. Efficacy of the antisense oligodeoxynucleotides was demonstrated in locomotor activity investigations, where they significantly reduced cocaine-induced locomotor activity; no efficacy was demonstrated with the microspheres, possibly because of the antisense loading and because measurements were taken during a lag phase of antisense oligodeoxynucleotide release. Biodegradable microspheres can be delivered site-specifically into the brain and provide sustained release of antisense oligodeoxynucleotides, offering the potential of in vivo efficacy for these reagents in the brain.
Abstract:
This study expands the current knowledge base on the nature, causes and fate of unused medicines in primary care. Three methodologies were used, and participants for each element were sampled from the population of Eastern Birmingham PCT. A detailed assessment was made of medicines returned to pharmacies and GP surgeries for destruction, and a postal questionnaire covering medicines use and disposal was sent to patients randomly selected from the electoral roll. The content of this questionnaire was informed by qualitative data from a group interview on the subject. By using these three methods it was possible to triangulate the data, providing a comprehensive assessment of unused medicines. Unused medicines were found to be ubiquitous in primary care, and cardiovascular, diabetic and respiratory medicines are unused in substantial quantities, accounting for a considerable proportion of the total financial value of all unused medicines. Additionally, analgesic and psychoactive medicines were highlighted as being unused in sufficient quantities for concern. Anti-infective medicines also appear to be present and unused in a substantial proportion of patients' homes. Changes to prescribed therapy and non-compliance were identified as important factors leading to the generation of unused medicines. However, a wide array of other elements influence the quantities and types of medicines that are unused, including the concordance of GP consultations and medication reviews, and patient factors such as age, sex or ethnicity. Medicines were appropriately discarded by 1 in 3 patients through return to a medical or pharmaceutical establishment. Inappropriate disposal was by placing in household refuse or through grey and black water, with the possibility of hoarding or diversion also being identified. No correlations were found between the weight of unused medicines and any clinical or financial factor.
The study has highlighted unused medicines as an issue of some concern, and one that requires further study.
Abstract:
This paper aims to help supply chain managers determine the value of retailer-supplier partnership initiatives beyond information sharing (IS) according to their specific business environment under time-varying demand conditions. For this purpose, we use integer linear programming models to quantify the benefits that can accrue to a retailer, a supplier and the system as a whole from a shift in inventory ownership and a shift in decision-making power, compared with IS alone. The results of a detailed numerical study over a static time horizon reveal that the shift in inventory ownership provides system-wide cost benefits in specific settings, particularly when it induces the retailer to order larger quantities and the supplier also prefers such orders because of significantly high setup and shipment costs. We observe that the relative benefits of a shift in decision-making power are always higher than those of a shift in inventory ownership under all conditions. The value of the shift in decision-making power is greater than that of IS particularly when the variability of the underlying demand is low and the time-dependent variation in production cost is high. However, when the shipment cost is negligible and the order-issuing efficiency of the supplier is low, the cost benefits of a shift in decision-making power beyond IS are not significant. © 2012 Taylor & Francis.
Abstract:
Purpose: To optimize anterior eye fluorescein viewing and image capture. Design: Prospective experimental investigation. Methods: The spectral radiance of the blue illumination of ten different models of slit-lamp and the spectral transmission of three barrier filters were measured. Optimal clinical instillation of fluorescein was evaluated by comparing four different methods of instilling fluorescein into 10 subjects. Two methods used a floret, and two used minims of different concentration. The resulting fluorescence was evaluated for quenching effects and efficiency over time. Results: The spectral radiance of the blue illumination typically had an average peak at 460 nm. Comparison between three slit-lamps of the same model showed a similar spectral radiance distribution. Of the slit-lamps examined, 8.3% to 50.6% of the illumination output was optimized for >80% fluorescein excitation, and 1.2% to 23.5% of the illumination overlapped with that emitted by the fluorophore. The barrier filters had an average cut-off at 510 to 520 nm. Quenching was observed for all methods of fluorescein instillation. The moistened floret and the 1% minim reached a useful level of fluorescence in ∼20 s on average (∼2.5× faster than the saturated floret and the 2% minim), and this lasted for ∼160 seconds. Conclusions: Most slit-lamps' blue light and yellow barrier filters are not optimal for fluorescein viewing and capture. Instillation of fluorescein using a moistened floret or a 1% minim seems most clinically appropriate, as lower quantities and concentrations of fluorescein improve the efficiency of clinical examination. © 2006 Elsevier Inc. All rights reserved.
Abstract:
As a new medium for questionnaire delivery, the internet has the potential to revolutionize the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability. Designers of online-questionnaires are faced with a plethora of design tools to assist in the development of their electronic questionnaires. Little, if any, support is incorporated, however, within these tools to guide online-questionnaire designers according to best practice. In essence, an online-questionnaire combines questionnaire-based survey functionality with that of a webpage/site. As such, the design of an online-questionnaire should incorporate principles from both contributing fields. Drawing on existing guidelines for paper-based questionnaire design, website design (paying particular attention to issues of accessibility and usability), and existing but scarce guidelines for electronic surveys, we have derived a comprehensive set of guidelines for the design of online-questionnaires. This article introduces this comprehensive set of guidelines – as a practical reference guide – for the design of online-questionnaires.
Abstract:
Algorithmic resources for the elaboration and identification of monotone functions are considered, and some alternative structures are introduced that are more explicit in terms of structure and quantities and can serve as elements of practical identification algorithms. General monotone recognition is considered on a multi-dimensional grid structure. A particular reconstruction problem is reduced to monotone recognition by partitioning the multi-dimensional grid into a set of binary cubes.
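As a small illustration of the underlying notion (not the paper's algorithm), monotonicity of a Boolean function on the binary n-cube can be verified by checking that flipping any single bit from 0 to 1 never decreases the function's value:

```python
from itertools import product

def is_monotone(f, n):
    """Check that f: {0,1}^n -> {0,1} is monotone: raising any coordinate
    from 0 to 1 never decreases f."""
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for i in range(n):
            if x[i] == 0:
                # Immediate successor of x in the cube's partial order.
                y = x[:i] + (1,) + x[i + 1:]
                if f(y) < fx:
                    return False
    return True

# Majority of three bits is monotone; parity is not.
def maj(x):
    return int(sum(x) >= 2)

def parity(x):
    return sum(x) % 2

print(is_monotone(maj, 3), is_monotone(parity, 3))  # → True False
```

Checking only immediate successors suffices because the cube's order relation is the transitive closure of single-bit raises, which keeps the test at O(n·2ⁿ) evaluations.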