890 results for Tchebyshev metrics


Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: The purpose of this study is to validate the Pulvers silhouette showcard as a measure of weight status in a population in the African region. This tool is particularly beneficial when scarce resources do not allow for direct anthropometric measurements due to limited survey time or a lack of measurement technology, whether in face-to-face general-purpose surveys or in mailed, online, or mobile device-based surveys. METHODS: A cross-sectional study was conducted in the Republic of Seychelles with a sample of 1240 adults. We compared self-reported body sizes measured by Pulvers' silhouette showcards to four measurements of body size and adiposity: body mass index (BMI), measured body fat percentage, waist circumference, and waist-to-height ratio. The accuracy of silhouettes as an obesity indicator was examined using sex-specific receiver operating characteristic (ROC) analysis, and the reliability of this tool in detecting socioeconomic gradients in obesity was compared to BMI-based measurements. RESULTS: Our study supports silhouette body size showcards as a valid and reliable survey tool to measure self-reported body size and adiposity in an African population. The mean correlation coefficients of self-reported silhouettes with measured BMI were 0.80 in men and 0.81 in women (P < 0.001). The silhouette showcards also showed high accuracy for detecting obesity defined as BMI ≥ 30 (area under the curve, AUC: 0.91/0.89, SE: 0.01), which was comparable to other measured adiposity indicators: fat percentage (AUC: 0.94/0.94, SE: 0.01), waist circumference (AUC: 0.95/0.94, SE: 0.01), and waist-to-height ratio (AUC: 0.95/0.94, SE: 0.01) among men and women, respectively. Using silhouettes to detect obesity differences among socioeconomic groups yielded associations between obesity and socioeconomic status of similar magnitude, direction, and significance as measured BMI.
CONCLUSIONS: This study highlights the validity and reliability of silhouettes as a survey tool for measuring obesity in a population in the African region. The ease of use and cost-effectiveness of this tool make it an attractive alternative to measured BMI in the design of non-face-to-face online or mobile device-based surveys, as well as in-person general-purpose surveys of obesity in the social sciences, where limited resources do not allow for direct anthropometric measurements.
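The ROC analysis above rests on a standard identity: the AUC equals the probability that a randomly chosen obese participant reports a larger silhouette than a randomly chosen non-obese one (the Mann-Whitney formulation). A minimal sketch, with invented silhouette scores that are not the study's data:

```python
def auc_from_scores(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case, with ties counting 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical silhouette numbers for obese (BMI >= 30) and
# non-obese participants -- illustration only.
obese = [7, 8, 5, 9, 7]
non_obese = [3, 4, 2, 5, 6, 4]
print(round(auc_from_scores(obese, non_obese), 2))  # -> 0.95
```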

Temporary streams are those watercourses that undergo recurrent cessation of flow or complete drying of their channel. The structure and composition of biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. Therefore, the structural and functional characteristics of the aquatic fauna cannot be used to assess the ecological quality of a temporary stream reach without taking into account the controls imposed by the hydrological regime. This paper develops methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the transient sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: Hyperrheic, Eurheic, Oligorheic, Arheic, Hyporheic and Edaphic. When the hydrological conditions lead to a change in the aquatic state, the structure and composition of the aquatic community change according to the new set of available habitats. We used water discharge records from gauging stations, or simulations with rainfall-runoff models, to infer the temporal patterns of occurrence of these states, presented in the Aquatic States Frequency Graph we developed. The visual analysis of this graph is complemented by two metrics which describe the permanence of flow and the seasonal predictability of zero-flow periods. Finally, a classification of temporary streams into four aquatic regimes, defined in terms of their influence on the development of aquatic life, is updated from the existing classifications, with stream aquatic regimes defined as Permanent, Temporary-pools, Temporary-dry and Episodic.
While aquatic regimes describe the long-term overall variability of the hydrological conditions of the river section and have been used for many years by hydrologists and ecologists, aquatic states describe the availability of mesohabitats in given periods that determine the presence of different biotic assemblages. This novel concept links hydrological and ecological conditions in a unique way. All these methods were implemented with data from eight temporary streams around the Mediterranean within the MIRAGE project. Their application was a precondition to assessing the ecological quality of these streams.
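The two graph-complementing metrics can be illustrated with a simplified sketch. The function names and definitions below are our assumptions, not the paper's formulas: flow permanence is approximated as the fraction of days with non-zero discharge, and the set of calendar months containing zero-flow days serves as a crude proxy for how seasonally concentrated (and hence predictable) the dry period is:

```python
def flow_permanence(daily_q):
    """Fraction of days with non-zero discharge; a simplified
    stand-in for the paper's flow-permanence metric."""
    return sum(1 for q in daily_q if q > 0) / len(daily_q)

def zero_flow_months(daily_q, month_index):
    """Calendar months (1-12) in which zero flow occurred. The fewer
    distinct months the dry days fall into, the more seasonally
    predictable the zero-flow period -- a crude illustrative proxy."""
    return {m for q, m in zip(daily_q, month_index) if q == 0}

# Hypothetical daily discharges (m3/s) and their calendar months.
q = [1.2, 0.0, 0.0, 3.4, 0.0, 2.1]
months = [5, 7, 7, 9, 8, 10]
print(flow_permanence(q))        # -> 0.5
print(sorted(zero_flow_months(q, months)))  # -> [7, 8]
```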

Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or the same source, and thus to support forensic intelligence efforts. Inspired by previous research work on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that using Hue and Edge filters, or their combination, to extract profiles from images, and then comparing the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared among different organisations, which makes it very convenient for future operational applications. The method could serve as a first, fast triage step that may help target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of documents).
Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
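The Canberra distance-based comparison of profiles can be sketched as follows; the profile values below are hypothetical stand-ins for the Hue- or Edge-filtered region profiles described above:

```python
def canberra(profile_a, profile_b):
    """Canberra distance between two intensity profiles: the sum of
    |a - b| / (|a| + |b|) over positions, skipping positions where
    both values are zero (the term is undefined there)."""
    d = 0.0
    for a, b in zip(profile_a, profile_b):
        denom = abs(a) + abs(b)
        if denom > 0:
            d += abs(a - b) / denom
    return d

# Hypothetical profiles extracted from two scanned documents.
print(canberra([1.0, 2.0, 0.0], [2.0, 2.0, 0.0]))  # -> 0.333...
```

Because each term is normalised by |a| + |b|, the Canberra distance weights proportional differences in low-intensity regions heavily, which plausibly helps discriminate between document sources.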

Computed tomography (CT) is an imaging technique in which interest has grown rapidly since its introduction in the early 1970s. Today, it has become an extensively used modality because of its ability to produce accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionising radiation on the population. Among those negative effects, one of the major remaining risks is the development of cancers associated with exposure to diagnostic X-ray procedures. In order to ensure that the benefit-risk ratio remains in favour of the patient, it is necessary to make sure that the delivered dose leads to the proper diagnosis without producing unnecessarily high-quality images. This optimisation scheme is already an important concern for adult patients, but it must become an even greater priority when examinations are performed on children or young adults, in particular in follow-up studies which require several CT procedures over the patient's life. Indeed, children and young adults are more sensitive to radiation due to their faster metabolism.
In addition, harmful consequences are more likely to occur because of a younger patient's longer life expectancy. The recent introduction of iterative reconstruction algorithms, which were designed to substantially reduce dose, is certainly a major achievement in CT evolution, but it has also created difficulties in assessing the quality of the images produced using those algorithms. The goal of the present work was to propose a strategy to investigate the potential of iterative reconstructions to reduce dose without compromising the ability to answer the diagnostic questions. The major difficulty lies in having a clinically relevant way to estimate image quality. To ensure the choice of pertinent image quality criteria, this work was performed in continuous, close collaboration with radiologists. The work began by tackling the way to characterise image quality in musculo-skeletal examinations. We focused, in particular, on the behaviour of image noise and spatial resolution when iterative image reconstruction was used. The analyses of these physical parameters allowed radiologists to adapt their image acquisition and reconstruction protocols while knowing what loss of image quality to expect. This work also dealt with the loss of low-contrast detectability associated with dose reduction, a major concern in abdominal investigations. Knowing that alternatives to classical Fourier-space metrics had to be used to assess image quality, we focused on the use of mathematical model observers. Our experimental parameters determined the type of model to use.
Ideal model observers were applied to characterise image quality when purely objective results about signal detectability were sought, whereas anthropomorphic model observers were used in a more clinical context, when the results had to be compared with those of human observers, thus taking advantage of their incorporation of elements of the human visual system. This work confirmed that the use of model observers makes it possible to assess image quality using a task-based approach, which, in turn, establishes a bridge between medical physicists and radiologists. It also demonstrated that statistical iterative reconstructions have the potential to reduce the delivered dose without impairing the quality of the diagnosis. Among the different types of iterative reconstructions, model-based ones offer the greatest potential, since images produced using this modality can still lead to an accurate diagnosis even when acquired at very low dose. This work has clarified the role of medical physicists in CT imaging: the standard metrics remain important for assessing a unit's compliance with legal requirements, but model observers are the tool of choice for optimising imaging protocols.
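As an illustration of the task-based approach, a non-prewhitening (NPW) observer reduces each image to a scalar statistic (its scalar product with the expected signal template), and detectability is summarised by the index d'. This is a minimal sketch under our own simplifications, not the thesis's actual observer models; anthropomorphic models additionally incorporate visual-system channels:

```python
import math

def npw_statistic(image, signal_template):
    """Non-prewhitening observer statistic: the scalar product of a
    (flattened) image with the expected signal template."""
    return sum(x * s for x, s in zip(image, signal_template))

def detectability_index(stats_signal, stats_noise):
    """d' from observer statistics on signal-present and signal-absent
    images: difference of means over the pooled standard deviation."""
    ms = sum(stats_signal) / len(stats_signal)
    mn = sum(stats_noise) / len(stats_noise)
    vs = sum((x - ms) ** 2 for x in stats_signal) / (len(stats_signal) - 1)
    vn = sum((x - mn) ** 2 for x in stats_noise) / (len(stats_noise) - 1)
    return (ms - mn) / math.sqrt((vs + vn) / 2)

# Hypothetical observer statistics from repeated phantom acquisitions.
print(detectability_index([2.0, 2.5, 3.0], [0.0, 0.5, 1.0]))  # -> 4.0
```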

Open educational resources (OER) promise increased access, participation, quality, and relevance, in addition to cost reduction. These seemingly fantastic promises are based on the supposition that educators and learners will discover existing resources, improve them, and share the results, resulting in a virtuous cycle of improvement and re-use. By anecdotal metrics, existing web-scale search is not working for OER. This situation impairs the cycle underlying the promise of OER, endangering long-term growth and sustainability. While the scope of the problem is vast, targeted improvements in the areas of curation, indexing, and data exchange can improve the situation and create opportunities for further scale. I explore the ways in which the current system is inadequate, discuss areas for targeted improvement, and describe a prototype system built to test these ideas. I conclude with suggestions for further exploration and development.

JXTA is a peer-to-peer (P2P) middleware which has undergone successive iterations through its 10 years of history, slowly incorporating a security baseline that may cater to different applications and services. However, in order to appeal to a broader set of secure scenarios, it would be interesting to take into consideration more advanced capabilities, such as anonymity. There are several proposals for anonymous protocols that can be applied in the context of a P2P network, but it is necessary to be able to choose the right one given each application's needs. In this paper, we provide an experimental evaluation of two relevant protocols, each belonging to a different category of approaches to anonymity: unimessage and split-message. We base our analysis on two scenarios, with stable and non-stable peers, and three metrics: round-trip time (RTT), node processing time and reliability.
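The RTT and reliability metrics can be illustrated with a small measurement harness; `send_and_wait` below is a hypothetical stand-in for one complete message exchange over the evaluated protocol, not a JXTA API:

```python
import statistics
import time

def measure_rtt(send_and_wait, trials=10):
    """Time repeated request/response exchanges. Returns mean and
    standard deviation of the RTT in milliseconds, plus the fraction
    of successful exchanges as a simple reliability figure."""
    samples, ok = [], 0
    for _ in range(trials):
        t0 = time.perf_counter()
        try:
            send_and_wait()          # one full round trip
            ok += 1
            samples.append((time.perf_counter() - t0) * 1000.0)
        except Exception:
            pass                     # failed exchange: counts against reliability
    mean = statistics.mean(samples) if samples else float("nan")
    stdev = statistics.stdev(samples) if len(samples) > 1 else 0.0
    return mean, stdev, ok / trials
```

A stable-peer scenario would call this against long-lived endpoints; a non-stable scenario would inject failures, which show up directly in the reliability figure.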

Inventory data management is defined as the accurate creation and maintenance of item master data, inventory location data and inventory balances per inventory location. The accuracy of inventory data depends on many elements of product data management, such as the management of changes during a component's life-cycle and the accuracy of product configuration in enterprise resource planning systems. Cycle-counting is the daily counting of inventory balances per inventory location and their comparison to the system data. The cycle-counting process is a way to measure the accuracy of a company's inventory data, and a well-managed process yields a great deal of information about that accuracy. General inaccuracy of the inventory data cannot be fixed merely by assigning resources to cycle-counting; the change requires disciplined adherence to the defined processes by all parties involved in updating inventory data throughout a component's life-cycle. The processes affecting inventory data are mapped and appropriate metrics are defined in order to achieve better manageability of the inventory data. The life-cycles of a single component and of a product are used in evaluating the processes.
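The cycle-counting comparison described above amounts to an inventory record accuracy figure. A minimal sketch, where the function name, the dictionary shapes and the tolerance handling are our assumptions rather than the thesis's metric definitions:

```python
def inventory_record_accuracy(system_balances, counted_balances, tolerance=0):
    """Share of counted locations whose physical count matches the
    system balance within an allowed tolerance (0 = exact match)."""
    hits = 0
    for location, counted in counted_balances.items():
        if abs(system_balances.get(location, 0) - counted) <= tolerance:
            hits += 1
    return hits / len(counted_balances)

# Hypothetical day's cycle count against ERP balances.
system = {"A1": 10, "B2": 5, "C3": 7}
counted = {"A1": 10, "B2": 4, "C3": 7}
print(round(inventory_record_accuracy(system, counted), 2))  # -> 0.67
```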

Background and purpose: In planning to meet evidence-based needs for radiotherapy, guidelines for the provision of capital and human resources are central if access, quality and safety are not to be compromised. A component of the ESTRO-HERO (Health Economics in Radiation Oncology) project is to document the current availability and content of guidelines for radiotherapy in Europe. Materials and methods: An 84-part questionnaire was distributed to the European countries through their national scientific and professional radiotherapy societies, with 30 items relating to the availability of guidelines for equipment and staffing and selected operational issues. Twenty-nine countries provided full or partial evaluable responses. Results: The availability of guidelines across Europe is far from uniform, and the metrics used for capital and human resources are variable. There seem to have been no major changes in the availability or specifics of guidelines over the ten-year period since the QUARTS study, with the exception of the recent expansion of RTT staffing models. Where comparison is possible, staffing for radiation oncologists, medical physicists and particularly RTTs tends to exceed guidelines, suggesting that developments in clinical radiotherapy are moving faster than guideline updating. Conclusion: The efficient provision of safe, high-quality radiotherapy services would benefit from the availability of well-structured guidelines for capital and human resources, based on agreed-upon metrics, which could be linked to detailed estimates of need.

This Master's thesis examines the performance of an industrial company from the customers' perspective. The goal is to build for the target company a measurement system that describes the company's performance from the customer's point of view. Based on the literature review, the relevant methods for measuring and analysing customer service needs are identified. In addition, the thesis presents models and principles for constructing a performance measurement system, as well as the factors affecting its implementation. In the empirical part, the service needs of key customers and the company's performance level are surveyed and analysed. A measurement system is built for the target company on the basis of customer service needs, suited to the company's operating environment and strategic goals. The measurement system is built on the customer-promise measures of the customer perspective of the Balanced Scorecard. As a result, the target company obtains a picture of its service level. With the customer-promise measurement system presented in the thesis, the target company can monitor and develop its performance. In addition, the thesis brings out important factors affecting the implementation of the measurement system and presents, in the form of recommendations, operating principles for the implementation process.

In preparing any project there is an estimate of the costs of the different items to be carried out. Software metrics can concern productivity, quality or technical characteristics, and can be size-oriented, function-oriented or person-oriented. This document deals with software metrics that focus on the performance of the software engineering process.

This thesis studies the capital structure of Finnish small and medium-sized enterprises. The specific object of the study is to test whether financial constraints have an effect on capital structure; the influences of several other factors are studied as well. Capital structure determinants are formulated based on three capital structure theories: the trade-off theory and the agency theory concentrate on the search for an optimal capital structure, while the pecking order theory concerns favouring one financing source over another. The data of this study consist of financial statement data and the results of a corporate questionnaire. Regression analysis was used to find out the effects of several determinants, with regression models formed based on the presented theories and short- and long-term debt ratios considered separately. A metric of financially constrained firms was included in all models. It was found that financial constraints have a negative and significant effect on short-term debt ratios. The effect was also negative for the long-term debt ratio, but not statistically significant. Other considerable factors that influenced debt ratios were fixed assets, age, profitability, single ownership and the sufficiency of internal financing.
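The regression approach can be illustrated in miniature. With a single financial-constraint dummy as the regressor, the OLS slope is simply the mean debt-ratio difference between constrained and unconstrained firms; the figures below are invented for illustration and merely echo the study's negative effect on short-term debt:

```python
def ols_slope(x, y):
    """Bivariate OLS slope of y on x. With x a 0/1 dummy (e.g. a
    financial-constraint indicator), the slope equals the difference
    in mean y between the two groups."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Hypothetical firms: 1 = financially constrained, 0 = unconstrained.
constrained = [1, 1, 1, 0, 0, 0]
short_term_debt_ratio = [0.20, 0.15, 0.25, 0.35, 0.30, 0.40]
print(round(ols_slope(constrained, short_term_debt_ratio), 2))  # -> -0.15
```

A full specification would add the other determinants (fixed assets, age, profitability, ownership) as regressors in a multiple regression.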

The goal was to study how the benefits of innovation-based research and development methods for organisations' innovation capability can be demonstrated. The purpose of the study was to develop, from a performance management perspective, a framework for measuring innovation capability and its effects. The empirical material was collected through workshops, interviews and group work sessions. Developing innovation capability is now of central importance, as organisations operate in very challenging environments. Nevertheless, measuring innovation capability in organisations is very rare, among other reasons because of the difficulty of such measurement and its abstract nature. Measurement is, however, an essential part of developing innovation capability and thus important for organisations' future success. The result of the study is a framework for evaluating the effects of innovation methods. The framework consists of five perspectives. The innovative-performance perspective measures the antecedents of innovation capability and the results of innovation activity. In addition, the financial, customer, internal-process and personnel perspectives measure the effects of developing innovation capability on the organisation's operations. The targets of the measurement system are set while the innovation methods are being applied, so the success factors and measures are defined case by case. The thesis nevertheless provides guidelines for defining success factors and measures, and the perspectives remain the same regardless of the case.

This thesis was produced for the Technology Marketing unit at the Nokia Research Center. Technology marketing was a new function at Nokia Research Center and needed an established framework with the capacity to take multiple aspects into account when measuring team performance. Technology marketing functions had existed in other parts of Nokia, yet no single method had been agreed upon for measuring their performance. The purpose of this study was to develop a performance measurement system for Nokia Research Center Technology Marketing. The target was that Nokia Research Center Technology Marketing would have a framework of separate metrics, including benchmarking for starting levels and future target values (numeric values were kept confidential within the company). As a result of this research, the Balanced Scorecard model of Kaplan and Norton was chosen as the performance measurement system for Nokia Research Center Technology Marketing. This research selected the indicators utilized in the chosen performance measurement system. Furthermore, the performance measurement system was defined to guide the Head of Marketing in managing the Nokia Research Center Technology Marketing team. During the research process the team mission, vision, strategy and critical success factors were outlined.

The present study builds on a previous proposal for assigning probabilities to the outcomes computed using different primary indicators in single-case studies. These probabilities are obtained by comparing the outcome to previously tabulated reference values, and they reflect the likelihood of the results if there were no intervention effect. The current study explores how well different metrics are translated into p values in the context of simulation data. Furthermore, two published multiple-baseline data sets are used to illustrate how well the probabilities reflect the intervention effectiveness as assessed by the original authors. Finally, the importance of which primary indicator is used in each data set to be integrated is explored, with two ways of combining probabilities: a weighted average and a binomial test. The results indicate that the translation into p values works well for the two nonoverlap procedures, with the results for the regression-based procedure diverging due to some undesirable features of its performance. These p values, both individually and combined, were well aligned with the effectiveness for the real-life data. The results suggest that assigning probabilities can be useful for translating the primary measure into the same metric, using these probabilities as additional evidence of the importance of behavioral change, complementing visual analysis and professionals' judgments.