909 results for measurement and metrology


Relevance:

90.00%

Publisher:

Abstract:

Background: Quality of life and well-being are frequently restricted in adults with neuromuscular disorders. As such, identification of appropriate interventions is imperative. Objective: The objective of this paper was to systematically review and critically appraise quantitative studies (RCTs, controlled trials and cohort studies) of psychosocial interventions designed to improve quality of life and well-being in adults with neuromuscular disorders. Method: A systematic review of the published and unpublished literature was conducted. Studies meeting inclusion criteria were appraised using a validated quality assessment tool and results were presented in a narrative synthesis. Results: Out of 3,136 studies identified, ten studies met criteria for inclusion within the review. Included studies comprised a range of interventions including: cognitive behavioural therapy, dignity therapy, hypnosis, expressive disclosure, gratitude lists, group psychoeducation and psychologically informed rehabilitation. Five of the interventions were for patients with Amyotrophic Lateral Sclerosis (ALS). The remainder were for patients with post-polio syndrome, muscular dystrophies and mixed disorders, such as Charcot-Marie-Tooth disease, myasthenia gravis and myotonic dystrophy. Across varied interventions and neuromuscular disorders, seven studies reported a short-term beneficial effect of intervention on quality of life and well-being. Whilst such findings are encouraging, widespread issues with the methodological quality of these studies significantly compromised the results. Conclusion: There is no strong evidence that psychosocial interventions improve quality of life and well-being in adults with neuromuscular disorders, due to a paucity of high-quality research in this field. Multi-site, randomised controlled trials with active controls, standardised outcome measurement and longer-term follow-ups are urgently required.

Relevance:

90.00%

Publisher:

Abstract:

In 2014, the Third International Conference on the resilience of social-ecological systems chose the theme “resilience and development: mobilizing for transformation.” The conference aimed specifically at fostering an encounter between the experiences and thinking focused on the issue of resilience through a social and ecological system perspective, and the experiences focused on the issue of resilience through a development perspective. In this perspectives piece, we reflect on the outcomes of the meeting, document the differences and similarities between the two perspectives as discussed during the conference, and identify bridging questions designed to guide future interactions. After the conference, we read the documents (abstracts, PowerPoints) that were prepared and left in the conference database by the participants (about 600 contributions), and searched the web for associated items, such as videos, blogs, and tweets from the conference participants. All of these documents were assessed through one lens: what do they say about resilience and development? Once the perspectives were established, we examined different themes that were significantly addressed during the conference. Our analysis paves the way for new collective developments on a set of issues: (1) Who declares/assigns/cares for the resilience of what, of whom? (2) What are the models of transformations and how do they combine the respective roles of agency and structure? (3) What are the combinations of measurement and assessment processes? (4) At what scale should resilience be studied? Social transformations and scientific approaches are co-constructed. For the past several decades, development has been conceived as a modernization process supported by scientific rationality and technical expertise. The definition of a new perspective on development goes with a negotiation on a new scientific approach. Resilience is presently at the center of this negotiation on a new science for development.

Relevance:

90.00%

Publisher:

Abstract:

Organizations in the public and private sectors have different aims for their accounting. Privately held organizations usually intend to make a profit, while public sector authorities aim to provide citizens with services. The difference between the two sectors is also visible in the regulation, where the International Accounting Standards Board sets the standards for privately held organizations and Ekonomistyrningsverket does the same for the public sector. Because of society's growing demand for knowledge and technology, which fall into the category of intangible assets, these assets have become more prominent in organizations' accounting. Intangible assets are, however, associated with measurement complexity. The purpose of this study is therefore to examine whether there are differences in how intangible assets, and internally generated intangible assets in particular, are measured and valued between listed companies in the private sector and authorities operating in the public sector. The study is conducted from both a qualitative and a quantitative perspective. The data collected are secondary, gathered from samples of annual reports from different companies in order to be representative of the whole population. The main result is that there are differences in the measurement and valuation of intangible assets depending on the sector in which an organization operates, and that these differences are not due to the standards and regulations. They are visible in the percentage change in the value of intangible assets, which fluctuates more heavily in the private sector than in the public sector. At the same time, the proportion of internally generated intangible assets relative to intangible assets in general differs between the two sectors.

Relevance:

90.00%

Publisher:

Abstract:

Many applications, including communications, test and measurement, and radar, require the generation of signals with a high degree of spectral purity. One method for producing tunable, low-noise source signals is to combine the outputs of multiple direct digital synthesizers (DDSs) arranged in a parallel configuration. In such an approach, if all noise is uncorrelated across channels, the noise will decrease relative to the combined signal power, resulting in a reduction of sideband noise and an increase in SNR. However, in any real array, the broadband noise and spurious components will be correlated to some degree, limiting the gains achieved by parallelization. This thesis examines the potential performance benefits that may arise from using an array of DDSs, with a focus on several types of common DDS errors, including phase noise, phase truncation spurs, quantization noise spurs, and quantizer nonlinearity spurs. Measurements to determine the level of correlation among DDS channels were made on a custom 14-channel DDS testbed. The investigation of the phase noise of a DDS array indicates that the contribution to the phase noise from the DACs can be decreased to a desired level by using a large enough number of channels. In such a system, the phase noise qualities of the source clock and the system cost and complexity will be the main limitations on the phase noise of the DDS array. The study of phase truncation spurs suggests that, at least in our system, the phase truncation spurs are uncorrelated, contrary to the theoretical prediction. We believe this decorrelation is due to the existence of an unidentified mechanism in our DDS array that is unaccounted for in our current operational DDS model. This mechanism, likely due to some timing element in the FPGA, causes some randomness in the relative phases of the truncation spurs from channel to channel each time the DDS array is powered up. This randomness decorrelates the phase truncation spurs, opening the potential for SFDR gain from using a DDS array. The analysis of the correlation of quantization noise spurs in an array of DDSs shows that the total quantization noise power of each DDS channel is uncorrelated for nearly all values of DAC output bits. This suggests that a nearly N-fold gain in SQNR is possible for an N-channel array of DDSs. This gain will be most apparent for low-bit DACs in which quantization noise is notably higher than the thermal noise contribution. Lastly, the measurements of the correlation of quantizer nonlinearity spurs demonstrate that the second and third harmonics are highly correlated across channels for all frequencies tested. This means that there is no benefit to using an array of DDSs for mitigating in-band quantizer nonlinearities. As a result, alternate methods of harmonic spur management must be employed.
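The coherent-combining argument above — the signal adds in voltage while uncorrelated noise adds only in power, so SNR improves by roughly 10·log10(N) dB for N channels — can be illustrated with a short simulation. This is a minimal sketch under the idealized assumption of fully uncorrelated additive white noise per channel; the sample rate, tone frequency, channel count and noise level are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1e6           # sample rate (Hz), arbitrary for the example
f0 = 123e3         # synthesized tone frequency (Hz)
n_samples = 2**14
n_channels = 14    # size of the hypothetical DDS array
noise_rms = 0.05   # per-channel additive noise, assumed uncorrelated

t = np.arange(n_samples) / fs
tone = np.sin(2 * np.pi * f0 * t)

def snr_db(x, clean):
    """Estimate SNR of x against the known clean tone."""
    noise = x - clean
    return 10 * np.log10(np.mean(clean**2) / np.mean(noise**2))

# One channel: tone plus its own noise realization.
single = tone + noise_rms * rng.standard_normal(n_samples)

# N channels combined: the tone adds coherently, the noise adds only in power.
channels = tone + noise_rms * rng.standard_normal((n_channels, n_samples))
combined = channels.mean(axis=0)

print(f"single-channel SNR : {snr_db(single, tone):5.1f} dB")
print(f"{n_channels}-channel SNR    : {snr_db(combined, tone):5.1f} dB")
print(f"expected gain      : {10 * np.log10(n_channels):5.1f} dB")
```

With correlated contributions (a shared clock, or the correlated harmonic spurs measured above), the corresponding terms do not average down, which is precisely the limitation the thesis quantifies.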

Relevance:

90.00%

Publisher:

Abstract:

Objective To develop a structurally valid and reliable, yet brief measure of patient experience of hospital quality of care, the Care Experience Feedback Improvement Tool (CEFIT), and to examine aspects of the utility of CEFIT. Background Measuring quality improvement at the clinical interface has become a necessary component of healthcare measurement and improvement plans, but the effectiveness of measuring such complexity is dependent on the purpose and utility of the instrument used. Methods CEFIT was designed from a theoretical model, derived from the literature and a content validity index (CVI) procedure. A population telephone survey recruited 802 eligible participants (healthcare experience within the previous 12 months) to complete CEFIT. Internal consistency reliability was tested using Cronbach's α. Principal component analysis was conducted to examine the factor structure and determine structural validity. Quality criteria were applied to judge aspects of utility. Results The CVI found a statistically significant proportion of agreement between patient and practitioner experts for CEFIT construction. 802 eligible participants answered the CEFIT questions. Cronbach's α coefficient for internal consistency indicated high reliability (0.78). Item-total correlations (0.28–0.73) were used to establish the final instrument. Principal component analysis identified one factor accounting for 57.3% of the variance. The quality critique rated CEFIT as fair for content validity, excellent for structural validity, good for cost, poor for acceptability and good for educational impact. Conclusions CEFIT offers a brief yet structurally sound measure of patient experience of quality of care. The brevity of the 5-item instrument arguably offers high utility in practice. Further studies are needed to explore the utility of CEFIT to provide a robust basis for feedback to local clinical teams and drive quality improvement in the provision of care experience for patients. Further development of aspects of utility is also required.
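For context, the Cronbach's α quoted above is computed as α = k/(k−1) · (1 − Σσᵢ²/σ²_total) for a k-item instrument. The snippet below is a generic illustration on fabricated responses, not the CEFIT data; all names and values are assumptions for the example.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: respondents x items matrix of item scores."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Fabricated example: 10 respondents answering a 5-item instrument (0-10 scale).
rng = np.random.default_rng(1)
latent = rng.integers(3, 9, size=(10, 1))                      # shared "true" experience
responses = np.clip(latent + rng.integers(-1, 2, size=(10, 5)), 0, 10)

print(f"Cronbach's alpha = {cronbach_alpha(responses.astype(float)):.2f}")
```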

Relevance:

90.00%

Publisher:

Abstract:

End-customer quality complaints directly related to automotive components have shown a negative trend at the European level for the entire automotive industry. This research therefore concentrates on the most important items of the Pareto chart, seeking to understand the failure type and the mechanism involved and the link and impact of the design and of the parameters on the process, and concludes with the development of one of the most desired tools of the company that hosted this project, a European methodology for classifying terminal defects, together with a list of real improvement opportunities based on the measurement and analysis of actual data. Through the development of the terminal defects classification methodology, which is considered a valuable asset to the company, all the other companies of the YAZAKI group will be able to characterize terminals as brittle or ductile and thus set in motion, more efficiently, the different existing internal procedures for safeguarding the components, improving manufacturing efficiency. From a brief observation alone, nothing can be said in an absolute sense about the failure causes: base materials, design, handling during manufacture and storage, as well as the cold work performed by plastic deformation, all play an important role. It was expected, however, that the failure would be due to a combination of factors rather than a single cause. In order to gain a better understanding of this problem, unexplored by the company up to the start of this study, a thorough review of the existing literature on the subject was conducted, real production sites were visited and, of course, actual parts were tested in a laboratory environment. To answer many of the major issues raised throughout the investigation, theoretical concepts covered in the literature review were used extensively, with a view to understanding the relationships between the different parameters concerned. It should be noted that technical studies on copper and its alloys are hard to find and rarely provide all the desirable information. This investigation was carried out as a YAZAKI Europe Limited company project and as a Master's thesis for Instituto Superior de Engenharia do Porto, conducted over 9 months in 2012/2013.

Relevance:

90.00%

Publisher:

Abstract:

The work developed focused on preparing the NP EN ISO/IEC 17025 accreditation of the Metrology Laboratory of the company Frilabo to provide services in the temperature area, namely the testing of thermal chambers and the calibration of industrial thermometers. Given the scope of the work, this thesis covers theoretical concepts of temperature and uncertainty, as well as technical considerations on temperature measurement and uncertainty calculation. Considerations on the different types of thermal chambers and thermometers are also presented. The text presents the documents prepared by the author on the thermal chamber test procedures and the corresponding uncertainty calculation procedure, as well as the documents prepared by the author on the calibration procedures for industrial thermometers and the corresponding uncertainty calculation procedure. For the thermal chamber tests and the thermometer calibrations, the author also drew up flowcharts of the temperature measurement methodology used in the tests, the temperature measurement methodology used in the calibrations, and the respective uncertainty calculations. The annexes contain various documents, such as the spreadsheet template for processing test data, the spreadsheet template for processing calibration data, the test report template, the calibration certificate template, and spreadsheets for managing clients/equipment and for the automatic numbering of test reports and calibration certificates, all of which meet the laboratory's management requirements. The annexes also include all the figures relating to temperature monitoring in the thermal chambers, as well as figures showing the arrangement of the thermometers inside the chambers. All unreferenced figures throughout the document were adapted or produced by the author. The decision to extend the scope of the accreditation of Frilabo's Metrology Laboratory to thermometer calibration was based on the fact that, for a laboratory already accredited for testing in the temperature area, establishing the traceability of the measurement standards in-house would allow optimized and more cost-effective resource management. The methodology for preparing the entire accreditation process of Frilabo's Metrology Laboratory was developed by the author and is described throughout the thesis, including the data relevant to achieving accreditation in both scopes. The evaluation of all the work carried out will be performed by the designated body, IPAC (Instituto Português de Acreditação), which grants accreditation in Portugal. This body will audit the company on the basis of the procedures developed and the results obtained, the most important of which is the Best Measurement Uncertainty (Balanço da Melhor Incerteza, BMI), also known as the Best Measurement Capability (Melhor Capacidade de Medição, MCM), both for the thermal chamber tests and for the thermometer calibrations, thereby complementing the services provided to Frilabo's loyal customers. Thermal chambers and industrial thermometers are widely used in many industrial sectors, in engineering, medicine and education, and in research institutions, their purposes being, respectively, the simulation of specific controlled conditions and the measurement of temperature.
For accredited entities such as laboratories, it is essential that measurements made with and on these types of equipment exhibit metrological reliability, since inadequate measurement results can lead to mistaken conclusions about the tests performed. The results obtained in the thermal chamber tests and in the thermometer calibrations are considered good and acceptable, since the best uncertainties obtained can be compared, through public consultation of the IPAC Technical Annex, with the uncertainties of other accredited laboratories in Portugal. From a more experimental standpoint, in thermal chamber testing the achievement of lower or higher uncertainties depends mostly on the behaviour, characteristics and state of repair of the chambers, which makes the stabilization of the temperature inside them particularly relevant. Most of the uncertainty sources in thermometer calibration come from the characteristics and specifications given by the equipment manufacturer, which contribute with equal weight to the calculation of the expanded uncertainty (the manufacturer's accuracy, the uncertainties inherited from calibration certificates, and the stability and uniformity of the thermal medium in which the calibrations are carried out). In thermometer calibration, the lowest uncertainties were obtained for the thermometers with the lowest resolutions. It was found that thermometers with a resolution of 1 °C did not detect the variations of the thermal bath. For thermometers with resolutions below this, the weight of the reading-dispersion contribution in the uncertainty calculation can vary according to the characteristics of the thermometer; thermometers with a resolution of 0.1 °C, for example, showed the largest contribution from the reading-dispersion component. It can be concluded that accrediting a laboratory is by no means an easy process. Some aspects can compromise the accreditation, such as a poor selection of the technicians and equipment that will perform the measurements (poorly trained technicians, equipment that is not suited to the range or is badly calibrated, etc.); if this selection is not done well, it will compromise all the following steps of the process. There must also be involvement of everyone in the laboratory, the quality manager, the technical manager and the technicians; only then is it possible to reach the intended quality and the continuous improvement of the laboratory's accreditation. Another important aspect in preparing the accreditation of a laboratory is researching the necessary and appropriate documentation in order to make correct decisions when drawing up the procedures that lead to it. The laboratory has to demonstrate its competence through records. Finally, it can be said that competence is the key word of an accreditation, as it manifests itself in the people, equipment, methods, facilities and other aspects of the institution to which the laboratory under accreditation belongs.
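As a rough illustration of how uncertainty contributions like those discussed above (inherited certificate uncertainty, manufacturer accuracy, stability and uniformity of the thermal medium, resolution, dispersion of readings) are typically combined, the sketch below follows the usual GUM-style root-sum-of-squares with a coverage factor k = 2. Every numeric value is invented for the example and does not come from the thesis or from Frilabo's procedures.

```python
import math

# Hypothetical standard-uncertainty contributions for one calibration point (in °C).
contributions = {
    "reference thermometer (certificate, k=2 -> /2)": 0.05 / 2,
    "manufacturer accuracy (rectangular -> /sqrt(3))": 0.10 / math.sqrt(3),
    "bath stability (rectangular)": 0.03 / math.sqrt(3),
    "bath uniformity (rectangular)": 0.04 / math.sqrt(3),
    "resolution of unit under test (0.1 °C -> /sqrt(12))": 0.1 / math.sqrt(12),
    "dispersion of readings (std. dev. of the mean)": 0.02,
}

u_combined = math.sqrt(sum(u**2 for u in contributions.values()))
U_expanded = 2 * u_combined  # coverage factor k = 2 (~95 % coverage)

for name, u in contributions.items():
    print(f"{name:52s} u = {u:.4f} °C")
print(f"combined standard uncertainty u_c = {u_combined:.3f} °C")
print(f"expanded uncertainty U (k = 2)    = {U_expanded:.3f} °C")
```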

Relevance:

90.00%

Publisher:

Abstract:

The objective of this study was to evaluate the chemical, color, textural, and sensorial characteristics of Serra da Estrela cheese and also to identify the factors affecting these properties, namely thistle ecotype, place of production, dairy and maturation. The results demonstrated that the cheeses lost weight mostly during the first stage of maturation, which was negatively correlated with moisture content, as was also observed for fat and protein contents. During maturation the cheeses became darker and acquired a yellowish coloration. A strong correlation was found between ash and chloride contents, the latter being directly related to the salt added in the manufacturing process. Flesh firmness showed a strong positive correlation with rind hardness and with the firmness of the inner paste. Stickiness was strongly related to all the other textural properties, indicative of the creamy nature of the paste. Adhesiveness was positively correlated with moisture content and negatively correlated with maturation time. The trained panelists liked the cheeses, giving high overall assessment scores, but these were not significantly correlated with the physicochemical properties. The salt differences between cheeses were not evident to the panelists, which was corroborated by the absence of correlation between the perception of saltiness and the analyzed chloride contents. Factorial analysis of the chemical and physical properties showed that they could be explained by two factors, one associated with texture and color and the other with the chemical properties. Finally, there was a clear influence of the thistle ecotype, place of production and dairy on the analyzed properties.

Relevance:

90.00%

Publisher:

Abstract:

The achievement and measurement of improvements and innovations is not often an overt practice in the design and delivery of government services other than health services. There is a need for specific mechanisms proven to increase the rate and scale of improvements and innovations in organisations, communities, regions and industries. This paper describes a model for the design, measurement and management of projects and services as systems for achieving and sustaining outcomes, improvements and innovations. The development of the model involved the practice of continuous improvement and innovation within and across a number of agricultural development projects in Australia and internationally. Key learnings from the development and use of the model are: (1) all elements and factors critical for success can be implemented, measured and managed; (2) the design of a meaningful systemic measurement framework is possible; (3) all project partners can achieve and sustain rapid improvements and innovations; (4) outcomes can be achieved from early in the life of projects; and (5) significant spill-over benefits can be achieved beyond the scope, scale and timeframe of projects.

Relevance:

90.00%

Publisher:

Abstract:

In 1935, Einstein, Podolsky and Rosen (EPR) questioned the completeness of quantum mechanics by devising a quantum state of two massive particles with maximally correlated space and momentum coordinates. The EPR criterion qualifies such continuous-variable entangled states, where a measurement of one subsystem seemingly allows for a prediction of the second subsystem beyond the Heisenberg uncertainty relation. Up to now, continuous-variable EPR correlations have only been created with photons, while the demonstration of such strongly correlated states with massive particles is still outstanding. Here we report on the creation of an EPR-correlated two-mode squeezed state in an ultracold atomic ensemble. The state shows an EPR entanglement parameter of 0.18(3), which is 2.4 s.d. below the threshold 1/4 of the EPR criterion. We also present a full tomographic reconstruction of the underlying many-particle quantum state. The state presents a resource for tests of quantum nonlocality and a wide variety of applications in the field of continuous-variable quantum information and metrology.
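For reference, the continuous-variable EPR (Reid) criterion behind the quoted threshold is commonly written as a bound on the product of inference variances; in a convention where the Heisenberg bound on that product is 1/4 (matching the threshold stated above), EPR correlations are demonstrated when the bound is violated, as sketched below.

```latex
% Reid EPR criterion, sketched in one common convention where [X, P] = i,
% so that \Delta X \, \Delta P \ge 1/2 and the variance-product bound is 1/4.
% A measured product below 1/4 demonstrates EPR correlations.
\Delta^{2}_{\mathrm{inf}} X \,\cdot\, \Delta^{2}_{\mathrm{inf}} P \;<\; \tfrac{1}{4}
```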

Relevance:

90.00%

Publisher:

Abstract:

The company studied is a Finnish manufacturer and international seller of paints and lacquers. In 2010 the company adopted new production and supply chain objectives and plans, and this study is part of that comprehensive development effort. The study examines OEE, a tool for measuring and improving the effectiveness of production and maintenance, and SMED, a tool for reducing product changeover times. The theoretical part of the work is based mainly on academic publications, but also on interviews, books, web pages and one annual report. In the empirical part, the problems and success of the OEE implementation were studied with a repeatable user survey. The potential and implementation of OEE were also studied by examining production and availability data collected from a production line. SMED was studied with the help of a computer program based on it; it was examined at a theoretical level only and has not yet been implemented in practice. According to the results, OEE and SMED suit the case company well and hold considerable potential. OEE reveals not only the amount of availability losses but also their structure. With OEE results, the company can direct its limited production and maintenance improvement resources to the right places. The production line examined in this work produced nothing during 56% of all planned production time in April 2016, and 44% of the line's stoppage time was due to changeover, start-up or shutdown work. From these results it can be concluded that availability losses are a serious problem for the company's production effectiveness and that reducing changeover work is an important development target. Changeover time could be reduced by about 15% with simple and inexpensive changes to the work sequence and tools identified with SMED, and the improvement would be even greater with more comprehensive changes. The greatest potential of SMED, however, may lie not in shortening changeover times but in standardizing them.
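OEE as used above is conventionally defined as the product Availability × Performance × Quality. The sketch below shows the standard calculation; all figures are invented for illustration and are not the thesis data.

```python
# Minimal OEE sketch: Availability x Performance x Quality.
# All figures below are invented for illustration.

planned_time_min = 480        # one planned production shift
downtime_min = 270            # changeovers, start-up/shutdown, breakdowns
ideal_cycle_time_min = 0.5    # ideal time to produce one unit
units_produced = 300
units_defective = 6

run_time_min = planned_time_min - downtime_min

availability = run_time_min / planned_time_min
performance = (ideal_cycle_time_min * units_produced) / run_time_min
quality = (units_produced - units_defective) / units_produced

oee = availability * performance * quality
print(f"Availability = {availability:.2%}")
print(f"Performance  = {performance:.2%}")
print(f"Quality      = {quality:.2%}")
print(f"OEE          = {oee:.2%}")
```

Breaking OEE into its three factors is what lets availability losses (such as the changeover, start-up and shutdown work highlighted above) be separated from speed and quality losses.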

Relevance:

90.00%

Publisher:

Abstract:

While a bathymetric echosounder is the essential device for carrying out hydrographic surveys, other external sensors are also indispensable (positioning system, motion unit, sound velocity profiler). Because sound does not travel in a straight line across the whole bathymetric swath, its measurement and processing are highly sensitive throughout the entire water column. DORIS provides an operational answer for sound velocity profile processing.

Relevance:

90.00%

Publisher:

Abstract:

This study mainly aims to provide an inter-industry analysis through the subdivision of various industries in flow of funds (FOF) accounts. Combined with the Financial Statement Analysis data from 2004 and 2005, the Korean FOF accounts are reconstructed to form "from-whom-to-whom" basis FOF tables, which are composed of 115 institutional sectors and correspond to tables and techniques of input–output (I–O) analysis. First, power of dispersion indices are obtained by applying the I–O analysis method. Most service and IT industries, construction, and light industries in manufacturing are included in the first quadrant group, whereas heavy and chemical industries are placed in the fourth quadrant since their power indices in the asset-oriented system are comparatively smaller than those of other institutional sectors. Second, investments and savings, which are induced by the central bank, are calculated for monetary policy evaluations. Industries are bifurcated into two groups to compare their features. The first group refers to industries whose power of dispersion in the asset-oriented system is greater than 1, whereas the second group indicates that their index is less than 1. We found that the net induced investments (NII)–total liabilities ratios of the first group are about half those of the second group, since the former's induced savings are clearly greater than the latter's.
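The power of dispersion index referred to above is the standard input–output statistic: with the Leontief inverse B = (I − A)⁻¹, sector j's index is its column sum of B divided by the average column sum, so values above 1 indicate above-average backward linkage. The sketch below runs that computation on a tiny made-up coefficient matrix, not the 115-sector Korean FOF data.

```python
import numpy as np

# Hypothetical 3-sector input (or asset) coefficient matrix A.
A = np.array([
    [0.10, 0.30, 0.05],
    [0.20, 0.05, 0.25],
    [0.05, 0.15, 0.10],
])

n = A.shape[0]
B = np.linalg.inv(np.eye(n) - A)   # Leontief inverse (I - A)^-1

col_sums = B.sum(axis=0)
power_of_dispersion = n * col_sums / B.sum()   # column sum relative to the average

for j, p in enumerate(power_of_dispersion, start=1):
    flag = "> 1: above-average backward linkage" if p > 1 else "< 1: below average"
    print(f"sector {j}: power of dispersion = {p:.3f}  ({flag})")
```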