765 results for procurement measuring
Abstract:
WAP will play an important role in the future when a suitable data transfer protocol is sought for new mobile services. Although WAP in some sense failed in its first coming, its popularity will certainly grow in the future. WAP's weak popularity was due not so much to the protocol's data transfer properties as to the immaturity of WAP services. Future services, however, will be more advanced, and WAP's popularity will grow. The most recent WAP-based service to be introduced is MMS. As new WAP-based services become widespread, this also places new requirements on the WAP gateway. This thesis examines different possibilities for measuring the service level of WAP services in a mobile setting. A mobile measurement component for WAP services is also implemented, operating as part of a larger software system. The aim is to implement a measurement component that emulates a real end user as closely as possible.
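The end-user emulation idea above can be sketched as a probe that issues repeated requests and records simple service-level metrics. This is only an illustrative sketch, not the thesis's actual measurement component; the injected `fetch` callback, the metric choices and the helper names are assumptions:

```python
import time
import urllib.request

def http_fetch(url, timeout=10.0):
    """One emulated end-user request; returns the observed latency in seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # read the whole page, as a real end user's browser would
    return time.monotonic() - start

def measure_service_level(fetch, attempts=5):
    """Run repeated requests and summarise success rate and mean latency.

    `fetch` performs one request and returns its latency, raising OSError
    on failure; injecting it keeps the probe independent of the transport.
    """
    latencies, failures = [], 0
    for _ in range(attempts):
        try:
            latencies.append(fetch())
        except OSError:
            failures += 1
    return {
        "success_rate": (attempts - failures) / attempts,
        "mean_latency": sum(latencies) / len(latencies) if latencies else None,
    }
```

A real WAP probe would replace `http_fetch` with a WSP/WTP request through the gateway under test; the aggregation logic stays the same.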
Abstract:
Social reciprocity may explain certain emerging psychological processes, which are likely to be founded on dyadic relations. Although some indices and statistics have been proposed to measure and make statistical decisions regarding social reciprocity in groups, these were generally developed to identify association patterns rather than to quantify the discrepancies between what each individual addresses to his/her partners and what is received from them in return. Additionally, social researchers are not only interested in measuring groups at the global level, since dyadic and individual measurements are also necessary for a proper description of social interactions. This study is concerned with a new statistic for measuring social reciprocity at the global level and with decomposing it in order to identify the dyads and individuals that account for a significant part of the asymmetry in social interactions. In addition to a set of indices, some exact analytical results are derived and a way of making statistical decisions is proposed.
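One simple global statistic in this spirit splits the sociomatrix into its symmetric and skew-symmetric parts and measures the share of total variation due to asymmetry, with a per-dyad decomposition. This is only an illustrative sketch; the paper's exact statistic and its sampling distribution are not reproduced here:

```python
import numpy as np

def global_asymmetry(X):
    """Global skew-symmetry statistic for a sociomatrix X.

    X[i, j] = amount of behaviour individual i directs to individual j.
    Returns the share of total variation due to the skew-symmetric part:
    0 = perfect reciprocity, 0.5 = maximal asymmetry.
    """
    X = np.array(X, dtype=float)       # copy, so the caller's matrix is untouched
    np.fill_diagonal(X, 0.0)           # self-directed behaviour is undefined
    K = (X - X.T) / 2.0                # skew-symmetric (asymmetric) part
    total = (X ** 2).sum()
    return (K ** 2).sum() / total if total else 0.0

def dyadic_contributions(X):
    """Each dyad's contribution to the asymmetry numerator, keyed by (i, j)."""
    X = np.array(X, dtype=float)
    K = (X - X.T) / 2.0
    n = X.shape[0]
    return {(i, j): 2 * K[i, j] ** 2 for i in range(n) for j in range(i + 1, n)}
```

Dyads with the largest contributions are the candidates "accounting for a significant part of asymmetry"; a formal decision would compare them against the statistic's null distribution.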
Abstract:
The goal of this Master's thesis was to create a plan for launching electronic procurement by analysing current purchasing processes and the corporation-wide electronic procurement system. The work is based on the literature on business-to-business electronic commerce, electronic procurement and systems design. The purpose of the plan is to help Siemens Oy move to the new electronic purchasing practice. Objectives were set for electronic purchasing, together with corresponding requirements. The analysis of the electronic procurement system is based on the phases of the system life-cycle model presented in the literature. The aim of the analysis was to determine the system's suitability for Siemens Oy's business environment, processes and requirements. The benefits of electronic procurement include improved management of business processes, reduced costs and increased financial performance. Launching electronic procurement, however, requires careful planning. The plans and analyses made in this thesis help in assessing the system's fit with Siemens Oy's requirements. Choosing the right, well-functioning system does not in itself guarantee that electronic procurement will pay off. The most important next steps are therefore to carry out a cost/benefit analysis and to assess suppliers' willingness and ability to participate in the marketplace.
Abstract:
In this study, the theoretical part compares different Value at Risk models. Based on that comparison, one model was chosen for the empirical part, which examines whether the model measures market risk accurately. The purpose of this study was to test whether Volatility-weighted Historical Simulation is accurate in measuring market risk and what improvements it brings to market risk measurement compared to traditional Historical Simulation. The volatility-weighting method of Hull and White (1998) was chosen in order to improve the traditional method's capability to measure market risk. We found that results based on Historical Simulation depend on the chosen time period, the confidence level and how the samples are weighted. The findings of this study are that the chosen method cannot be considered fully reliable in measuring market risk, because the backtesting results vary over the study period.
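Hull and White's volatility weighting can be sketched as follows: each historical return is rescaled by the ratio of current volatility to the volatility prevailing on that day, and VaR is then the empirical quantile of the rescaled sample. The EWMA volatility estimator (with a RiskMetrics-style lambda) is an assumption of this sketch, not necessarily the study's exact choice:

```python
import numpy as np

def ewma_volatility(returns, lam=0.94):
    """Per-day EWMA volatility estimate (RiskMetrics-style decay factor)."""
    var = np.empty(len(returns), dtype=float)
    var[0] = returns[0] ** 2
    for t in range(1, len(returns)):
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t] ** 2
    return np.sqrt(var)

def vw_historical_var(returns, alpha=0.99, lam=0.94):
    """Volatility-weighted Historical Simulation VaR (after Hull & White, 1998).

    Each past return r_t is rescaled by sigma_now / sigma_t so that the
    historical sample reflects today's volatility level; VaR is the
    empirical loss quantile of the rescaled sample, reported positive.
    """
    returns = np.asarray(returns, dtype=float)
    sigma = ewma_volatility(returns, lam)
    scaled = returns * sigma[-1] / sigma   # re-weight sample to current volatility
    return -np.quantile(scaled, 1 - alpha)
```

Plain Historical Simulation is the special case where no rescaling is applied; backtesting then counts how often realised losses exceed the reported VaR.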
Abstract:
Recent ALICE@LHC measurements have shown that the odd flow harmonics, in particular the directed flow v1, turn out to be weak and dominated by random fluctuations. In this work we propose a new method that makes the measurements more sensitive to flow patterns exhibiting global collective symmetries. We demonstrate how the longitudinal center-of-mass rapidity fluctuations can be identified, so that the collective flow analysis can be performed in the event-by-event center-of-mass frame. Such a method can be very effective in separating the flow patterns originating from random fluctuations from those originating from the global symmetry of the initial state.
Abstract:
The functional method is a new test theory using a new scoring method that assumes complexity in test structure and thus takes into account every correlation between factors and items. Its main distinctive feature is that it models test scores by multiple regression instead of estimating them with simplistic sums of points. To proceed, the functional method requires the creation of a hyperspherical measurement space, in which item responses are expressed by their correlation with orthogonal factors. This method has three main qualities. First, measures are expressed in the absolute metric of correlations; therefore, items, scales and persons are expressed in the same measurement space using the same single metric. Second, factors are systematically orthogonal and error-free, which is optimal for predicting other outcomes. Such predictions can be performed to estimate how one would answer other tests, or even to model one's response strategy if it were perfectly coherent. Third, the functional method provides measures of individuals' response validity (i.e., control indices). Herein, we propose a standard procedure, based on these control indices, for identifying whether test results are interpretable and for excluding invalid results caused by various response biases.
Abstract:
In recent years there has been growing interest in composite indicators as an efficient tool of analysis and a method of prioritizing policies. This paper presents a composite index of intermediary determinants of child health using a multivariate statistical approach. The index shows how specific determinants of child health vary across Colombian departments (administrative subdivisions). We used data collected from the 2010 Colombian Demographic and Health Survey (DHS) for 32 departments and the capital city, Bogotá. Adapting the conceptual framework of the Commission on Social Determinants of Health (CSDH), five dimensions related to child health are represented in the index: material circumstances, behavioural factors, psychosocial factors, biological factors and the health system. In order to generate the weights of the variables, and taking into account the discrete nature of the data, principal component analysis (PCA) using polychoric correlations was employed in constructing the index. From this method five principal components were selected. The index was estimated using a weighted average of the retained components. A hierarchical cluster analysis was also carried out. The results show that the biggest differences in intermediary determinants of child health are associated with health care before and during delivery.
Abstract:
The RFLP/PCR approach (restriction fragment length polymorphism/polymerase chain reaction) to genotypic mutation analysis described here measures mutations in restriction recognition sequences. Wild-type DNA is restricted before the resistant, mutated sequences are amplified by PCR and cloned. We tested the capacity of this experimental design to isolate a few copies of a mutated sequence of the human c-Ha-ras1 gene from a large excess of wild-type DNA. For this purpose we constructed a 272 bp fragment with 2 mutations in the PvuII recognition sequence 1727-1732 and studied the rescue by RFLP/PCR of a few copies of this 'PvuII mutant standard'. Following amplification with Taq-polymerase and cloning into lambda gt10, plaques containing wild-type sequence, PvuII mutant standard or Taq-polymerase-induced bp changes were quantitated by hybridization with specific oligonucleotide probes. Our results indicate that 10 PvuII mutant standard copies can be rescued from 10^8 to 10^9 wild-type sequences. Taq polymerase errors originating from unrestricted, residual wild-type DNA were sequence dependent and consisted mostly of transversions originating at G.C bp. In contrast to a doubly mutated 'standard', the capacity to rescue single bp mutations by RFLP/PCR is limited by Taq-polymerase errors. Therefore, we assessed the capacity of our protocol to isolate a G to T transversion mutation at base pair 1698 of the MspI site 1695-1698 of the c-Ha-ras1 gene from excess wild-type ras1 DNA. We found that 100 copies of the mutated ras1 fragment could be readily rescued from 10^8 copies of wild-type DNA.
Abstract:
This review presents the evolution of steroid analytical techniques, including gas chromatography coupled to mass spectrometry (GC-MS), immunoassay (IA) and targeted liquid chromatography coupled to mass spectrometry (LC-MS), and it evaluates the potential of extended steroid profiles by a metabolomics-based approach, namely steroidomics. Steroids regulate essential biological functions including growth and reproduction, and perturbations of the steroid homeostasis can generate serious physiological issues; therefore, specific and sensitive methods have been developed to measure steroid concentrations. GC-MS measuring several steroids simultaneously was considered the first historical standard method for analysis. Steroids were then quantified by immunoassay, allowing a higher throughput; however, major drawbacks included the measurement of a single compound instead of a panel and cross-reactivity reactions. Targeted LC-MS methods with selected reaction monitoring (SRM) were then introduced for quantifying a small steroid subset without the problems of cross-reactivity. The next step was the integration of metabolomic approaches in the context of steroid analyses. As metabolomics tends to identify and quantify all the metabolites (i.e., the metabolome) in a specific system, appropriate strategies were proposed for discovering new biomarkers. Steroidomics, defined as the untargeted analysis of the steroid content in a sample, was implemented in several fields, including doping analysis, clinical studies, in vivo or in vitro toxicology assays, and more. This review discusses the current analytical methods for assessing steroid changes and compares them to steroidomics. Steroids, their pathways, their implications in diseases and the biological matrices in which they are analysed will first be described. Then, the different analytical strategies will be presented with a focus on their ability to obtain relevant information on the steroid pattern. The future technical requirements for improving steroid analysis will also be presented.
Abstract:
Network neutrality is a growing policy controversy. Traffic management techniques affect not only high-speed, high-money content, but by extension all other content too. Internet regulators and users may tolerate much more discrimination in the interests of innovation. For instance, in the absence of regulatory oversight, ISPs could use Deep Packet Inspection (DPI) to block some content altogether, if they decide it is not to the benefit of ISPs, copyright holders, parents or the government. ISP blocking is currently widespread in controlling spam email and, in some countries, in blocking sexually graphic illegal images. In 1999 this led to scrutiny of Instant Messaging foreclosure in a video and cable-telephony horizontal merger. Fourteen years later, in 2013, net neutrality laws had been implemented in Slovenia, the Netherlands, Chile and Finland, regulation in the United States and Canada, co-regulation in Norway, and self-regulation in Japan, the United Kingdom and many other European countries. Both Germany and France debated new net neutrality legislation in mid-2013, and the European Commission announced on 11 September 2013 that it would aim to introduce legislation in early 2014. This paper analyses these legal developments, and in particular the difficulty of assessing reasonable traffic management and 'specialized' (i.e. unregulated) faster services in both EU and US law. It also assesses net neutrality law against international legal norms for user privacy and freedom of expression.
Abstract:
This paper presents a composite index of early childhood health using a multivariate statistical approach. The index shows how child health varies across Colombian departments (administrative subdivisions). In recent years there has been growing interest in composite indicators as an efficient analysis tool and a way of prioritizing policies. These indicators not only enable multi-dimensional phenomena to be simplified but also make it easier to measure, visualize, monitor and compare a country's performance on particular issues. We used data collected from the Colombian Demographic and Health Survey (DHS) for 32 departments and the capital city, Bogotá, in 2005 and 2010. The variables included in the index provide a measure of three dimensions related to child health: health status, health determinants and the health system. In order to generate the weights of the variables and take into account the discrete nature of the data, we employed principal component analysis (PCA) using polychoric correlations. From this method, five principal components were selected. The index was estimated using a weighted average of the retained components. A hierarchical cluster analysis was also carried out. We observed that the departments ranked in the lowest positions are located on the Colombian periphery. They are departments with low per capita incomes and critical social indicators. The results suggest that the regional disparities in child health may be associated with differences in parental characteristics, household conditions and economic development levels, which makes clear the importance of context in the study of child health in Colombia.
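The index construction described above can be sketched as PCA on standardized indicators followed by a variance-weighted average of the retained component scores. In this sketch, plain Pearson correlations stand in for the polychoric correlations the paper uses for discrete data, so it is illustrative only:

```python
import numpy as np

def composite_index(data, n_components=5):
    """Composite index as a variance-weighted average of principal components.

    `data`: (units x indicators) matrix, e.g. departments by health indicators.
    Pearson correlations are used here as a simplification; the paper employs
    polychoric correlations to respect the discrete nature of the items.
    """
    X = np.asarray(data, dtype=float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # standardize indicators
    R = np.corrcoef(Z, rowvar=False)                  # correlation matrix
    eigval, eigvec = np.linalg.eigh(R)
    order = np.argsort(eigval)[::-1]                  # largest variance first
    eigval, eigvec = eigval[order], eigvec[:, order]
    k = min(n_components, Z.shape[1])
    scores = Z @ eigvec[:, :k]                        # component scores per unit
    weights = eigval[:k] / eigval[:k].sum()           # share of retained variance
    return scores @ weights                           # one index value per unit
```

The resulting vector can then be ranked across departments or fed into a hierarchical cluster analysis, as in the paper.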
Abstract:
Improving educational quality is an important public policy goal. However, its success requires identifying the factors associated with student achievement. At the core of these proposals lies the principle that increased public school quality can make the school system more efficient, resulting in correspondingly stronger performance by students. Nevertheless, the public educational system is not devoid of competition, which arises, among other factors, from the efficiency of management and the geographical location of schools. Moreover, families in Spain appear to choose a school on the grounds of location. In this environment, the objective of this paper is to analyze whether geographical space has an impact on the relationship between the level of technical quality of public schools (measured by the efficiency score) and the school demand index. To do this, an empirical application is performed on a sample of 1,695 public schools in the region of Catalonia (Spain). This application shows the effects of spatial autocorrelation on the estimation of the parameters and how these problems are addressed through spatial econometric models. The results confirm that space has a moderating effect on the relationship between efficiency and school demand, although only in urban municipalities.
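A standard first diagnostic for the spatial autocorrelation the paper addresses is Moran's I computed over a spatial weights matrix. This is an illustrative sketch of the diagnostic only, not the paper's spatial econometric specification:

```python
import numpy as np

def morans_i(values, W):
    """Moran's I statistic for global spatial autocorrelation.

    `values`: one attribute per spatial unit (e.g. school efficiency scores);
    `W`: spatial weights matrix, W[i, j] > 0 when units i and j are neighbours.
    Values near +1 indicate clustering of similar values, near -1 dispersion,
    and near 0 spatial randomness.
    """
    x = np.asarray(values, dtype=float)
    W = np.asarray(W, dtype=float)
    z = x - x.mean()                   # deviations from the mean
    n = len(x)
    num = n * (z @ W @ z)              # cross-products between neighbours
    den = W.sum() * (z @ z)            # normalisation by weights and variance
    return num / den
```

A significantly non-zero I signals that ordinary least squares estimates are unreliable, motivating the spatial lag or spatial error models the paper applies.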
Abstract:
Previous studies have examined the experience of owning a virtual surrogate body or body part through specific combinations of cross-modal multisensory stimulation. Both visuomotor (VM) and visuotactile (VT) synchronous stimulation have been shown to be important for inducing a body ownership illusion, each tested separately or both in combination. In this study we compared the relative importance of these two cross-modal correlations when both are provided in the same immersive virtual reality setup and the same experiment. We systematically manipulated the VT and VM contingencies in order to assess their relative roles and mutual interaction. Moreover, we present a new method for measuring the induced body ownership illusion over time, by recording reports of breaks in the illusion of ownership ("breaks") throughout the experimental phase. The balance of the evidence, from both the questionnaires and the analysis of the breaks, suggests that while synchronous VM stimulation contributes the most to the attainment of the illusion, a disruption of either correlation (through asynchronous stimulation) contributes equally to the probability of a break in the illusion.