936 results for Reliability in automation


Relevance: 40.00%

Abstract:

The National Academies has stressed the need to develop quantifiable measures for methods that are currently qualitative in nature, such as the examination of fingerprints. Current protocols and procedures for these examinations rely heavily on a succession of subjective decisions, from the initial acceptance of evidence for probative value to the final assessment of forensic results. This project studied the concept of sufficiency associated with the decisions made by latent print examiners at the end of the various phases of the examination process. During this 2-year effort, a web-based interface was designed to capture the observations of 146 latent print examiners and trainees on 15 pairs of latent/control prints. Two main findings resulted from the study: (1) the concept of sufficiency is driven mainly by the number of, and spatial relationships between, the minutiae observed on the latent and control prints; the data indicate that demographics (training, certification, years of experience) and non-minutiae-based features (such as level 3 features) do not play a major role in examiners' decisions. (2) Significant variability was observed in the detection and interpretation of friction ridge features at all levels of detail, as well as in factors that can influence the examination process, such as degradation, distortion, or the influence of the background and the development technique.

Relevance: 40.00%

Abstract:

Family cohesion and adaptability, as operationalised in the Family Adaptability and Cohesion Scales III (FACES III), are two hypothesised dimensions of family functioning. We tested the properties of a French version of FACES III in schoolchildren (mean age: 13 years; S.D.: 0.85) recruited from the general population, and in their parents. Separate confirmatory factor analyses were performed for adolescents and adults. The results of both analyses were compatible with a two-factor structure similar to that proposed by the authors of the original instrument. However, orthogonality between the two factors was supported only in the adult data. Internal reliability estimates were 0.78 and 0.68 in adolescents and 0.82 and 0.65 in adults, for cohesion and adaptability respectively.

Relevance: 40.00%

Abstract:

Background: General practitioners play a central role in taking deprivation into consideration when caring for patients in primary care. Validated questions to identify deprivation in primary-care practices are still lacking. For both clinical and research purposes, this study therefore aims to develop and validate a standardized instrument measuring both material and social deprivation at an individual level. Methods: The Deprivation in Primary Care Questionnaire (DiPCare-Q) was developed using qualitative and quantitative approaches between 2008 and 2011. A systematic review identified 199 questions related to deprivation. Using judgmental item quality, these were reduced to 38 questions. Two focus groups (primary-care physicians and primary-care researchers), structured interviews (10 laymen), and think-aloud interviews (eight cleaning staff) assured face validity. Item response theory analysis was then used to derive the DiPCare-Q index from data obtained from a random sample of 200 patients, who completed the questionnaire a second time over the phone. For construct and criterion validity, the final 16 questions were administered to a random sample of 1,898 patients attending one of 47 different private primary-care practices in western Switzerland (validation set), along with questions on subjective social status (subjective SES ladder), education, source of income, welfare status, and subjective poverty. Results: Deprivation was defined along three distinct dimensions (table): material deprivation (eight items), social deprivation (five items) and health deprivation (three items). Item consistency was high in both the derivation set (KR-20 = 0.827) and the validation set (KR-20 = 0.778). The DiPCare-Q index was reliable (ICC = 0.847). For construct validity, we showed the DiPCare-Q index to be correlated with patients' estimation of their position on the subjective SES ladder (rs = 0.539). This position was correlated with both material and social deprivation independently, suggesting two separate mechanisms enhancing the feeling of deprivation. Conclusion: The DiPCare-Q is a rapid, reliable and validated instrument for measuring both material and social deprivation in primary care. Questions from the DiPCare-Q are easy to use when investigating patients' social history and could improve clinicians' ability to detect underlying social distress related to deprivation.
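The KR-20 coefficient reported for the derivation and validation sets applies to dichotomous (0/1) items like those of the DiPCare-Q. As an illustration only (this is not the authors' code; the function name and the toy response matrix are ours), the statistic can be computed as follows:

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) items.

    `responses` is a list of respondents, each a list of 0/1 item scores.
    KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total scores)).
    """
    n_items = len(responses[0])
    n_resp = len(responses)
    # Proportion endorsing each item, and the sum of item variances p*q.
    p = [sum(r[i] for r in responses) / n_resp for i in range(n_items)]
    pq = sum(pi * (1 - pi) for pi in p)
    # Population variance of the total scores.
    totals = [sum(r) for r in responses]
    mean_t = sum(totals) / n_resp
    var_t = sum((t - mean_t) ** 2 for t in totals) / n_resp
    return (n_items / (n_items - 1)) * (1 - pq / var_t)
```

With perfectly consistent items the coefficient reaches 1.0; values around 0.8, as reported above, indicate high but not perfect internal consistency.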

Relevance: 40.00%

Abstract:

The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely the industrial automation industry behind PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC then evolved in parallel with the constant advancement of Moore's law. These developments produced the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike in the CISC world, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on the fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, accounting for the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining their own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 40.00%

Abstract:

BACKGROUND: Excessive drinking is a major problem in Western countries. The AUDIT (Alcohol Use Disorders Identification Test) is a 10-item questionnaire developed as a transcultural screening tool to detect excessive alcohol consumption and dependence in primary health care settings. OBJECTIVES: The aim of the study was to validate a French version of the AUDIT. METHODS: We conducted a cross-sectional validation study in three French-speaking areas (Paris, Geneva and Lausanne). We examined the psychometric properties of the AUDIT, such as its internal consistency, and its capacity to correctly diagnose alcohol abuse or dependence as defined by DSM-IV and to detect hazardous drinking (defined as an alcohol intake of >30 g of pure ethanol per day for men and >20 g of pure ethanol per day for women). We calculated sensitivity, specificity, positive and negative predictive values, and Receiver Operating Characteristic (ROC) curves. Finally, we compared the ability of the AUDIT to accurately detect "alcohol abuse/dependence" with that of the CAGE and MAST questionnaires. RESULTS: 1207 patients presenting to outpatient clinics (Switzerland, n = 580) or general practitioners' offices (France, n = 627) successively completed the CAGE, MAST and AUDIT self-administered questionnaires, and were independently interviewed by a trained addiction specialist. The AUDIT showed a good capacity to discriminate dependent patients (AUDIT ≥ 13 for males: sensitivity 70.1%, specificity 95.2%, PPV 85.7%, NPV 94.7%; for females: sensitivity 94.7%, specificity 98.2%, PPV 100%, NPV 99.8%) and hazardous drinkers (AUDIT ≥ 7 for males: sensitivity 83.5%, specificity 79.9%, PPV 55.0%, NPV 82.7%; AUDIT ≥ 6 for females: sensitivity 81.2%, specificity 93.7%, PPV 64.0%, NPV 72.0%). The AUDIT gave better results than the MAST and CAGE for detecting "alcohol abuse/dependence", as shown by the comparative ROC curves.
CONCLUSIONS: The AUDIT questionnaire remains a good screening instrument for French-speaking primary care.
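The screening statistics quoted above (sensitivity, specificity, PPV, NPV) all derive from the same 2x2 table of test result versus reference diagnosis. A minimal sketch of the arithmetic, with a hypothetical cell count example of our own (not data from the study):

```python
def screening_stats(tp, fp, fn, tn):
    """Standard screening-test statistics from a 2x2 table.

    tp/fp/fn/tn are the true positive, false positive,
    false negative and true negative counts.
    """
    return {
        "sensitivity": tp / (tp + fn),  # positives correctly flagged
        "specificity": tn / (tn + fp),  # negatives correctly cleared
        "ppv": tp / (tp + fp),          # P(disorder | positive test)
        "npv": tn / (tn + fn),          # P(no disorder | negative test)
    }
```

Note that, unlike sensitivity and specificity, PPV and NPV depend on the prevalence of the disorder in the sample, which is why they differ between the Swiss and French settings for the same cut-off.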

Relevance: 40.00%

Abstract:

Basic acceptance testing is an essential part of monitoring the maturity of S60 platform release candidates. It is also performed to verify that the software is fit for release. Test results are always wanted as quickly as possible. In addition, the test team's workload has gradually grown, because there are more projects, and more correction builds and customised test sets are being tested. This thesis investigates whether automating part of the test set would shorten test execution time and ease the testers' workload. The investigation was carried out by automating part of the test set, and the experiences are presented in this thesis.

Relevance: 40.00%

Abstract:

Both the intermolecular interaction energies and the geometries for M⋯thiophene, M⋯pyrrole, M^n+⋯thiophene, and M^n+⋯pyrrole complexes (with M = Li, Na, K, Ca, and Mg; and M^n+ = Li^+, Na^+, K^+, Ca^2+, and Mg^2+) have been estimated using four commonly used density functional theory (DFT) methods: B3LYP, B3PW91, PBE, and MPW1PW91. Results have been compared to those provided by the HF, MP2, and MP4 conventional ab initio methods. PBE and MPW1PW91 are the only DFT methods able to provide a reasonable description of the neutral M⋯π complexes. Regarding the M^n+⋯π complexes, all four DFT methods have proven adequate in predicting these electrostatically stabilized systems, even though they tend to overestimate the interaction energies.

Relevance: 40.00%

Abstract:

Introduction: Occupational therapists could play an important role in facilitating driving cessation for ageing drivers. This, however, requires an easy-to-learn, standardised on-road evaluation method. This study therefore investigates whether the use of P-drive could be reliably taught to occupational therapists in a short half-day training session. Method: Using the English 26-item version of P-drive, two occupational therapists evaluated the driving ability of 24 home-dwelling drivers aged 70 years or over on a standardised on-road route. Experienced driving instructors' subjective on-road evaluations were then compared with P-drive scores. Results: Following a short half-day training session, P-drive was shown to have almost perfect inter-rater reliability (ICC(2,1) = 0.950, 95% CI 0.889 to 0.978). Reliability was stable across sessions, including the training phase, even if the occupational therapists seemed to become slightly less severe in their ratings with experience. The P-drive score was related to the driving instructors' subjective evaluations of driving skills in a non-linear manner (R² = 0.445, p = 0.021). Conclusion: P-drive is a reliable instrument that can easily be taught to occupational therapists and implemented as a way of standardising the on-road driving test.
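The ICC(2,1) form used above treats both subjects (drivers) and raters (the two occupational therapists) as random effects and measures absolute agreement for a single rater. A sketch of the computation from the two-way ANOVA mean squares, using our own function name and a toy rating matrix rather than the study's data:

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` is a list of subjects, each a list of scores from k raters.
    Formula: (MSR - MSE) / (MSR + (k-1)*MSE + k*(MSC - MSE)/n).
    """
    n = len(ratings)      # number of subjects
    k = len(ratings[0])   # number of raters
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # mean square, subjects
    msc = ss_cols / (k - 1)                 # mean square, raters
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)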

Relevance: 40.00%

Abstract:

The purpose of this thesis is to study business process re-engineering, and in particular the redesign of procurement processes, in an international process automation company. The topic is examined from the perspectives of personnel, new technology and process development. A well-known quality tool, Six Sigma, was used in the process development work. The main objective of the thesis is to create new operating models for the processes being developed. In addition to the literature, the work is based on the author's experiences and observations as a member of the project team. Process development is also examined from the perspective of change management, an area that should never be neglected in development initiatives. When developing business processes, change management is of primary importance for success; it is studied here from both the organisational and the individual point of view. The thesis also introduces Six Sigma, a globally known and widely used quality tool, which is applied in the case study presented at the end of the work. In the case study, the processes are examined from their current state through to the new operating models, and the processes and their performance are analysed continuously as the development projects progress.

Relevance: 40.00%

Abstract:

In this study the performance measurement, a part of the research and development of the RNC, was improved by implementing counter testing in the Nokia Automation System. The automation of counter testing is a feature the customer ordered, because performing counter testing manually is rather complex. The objective was to implement an automated counter testing system which, once configured correctly, would run the tests and perform the analysis. The requirements for counter testing were studied first. The feasibility of automating the feature was investigated in meetings with the customer, and the basic functionality required for the automation was outlined. The technologies used in the architecture of the Nokia Automation System were then studied. Based on the results of this study, a new technology, wxWidgets, was introduced; it was necessary to facilitate the implementation of the required feature. Finally, the counter testing functionality was specified and implemented. The result of this study is an automated counter testing method developed as a new feature of the Nokia Automation System. The feature meets the specifications and requirements set by the customer. The counter testing itself is fully automated; only the configuration of the test cases is done by the user. The customer has presented new requests to further develop the feature, and the Nokia Automation System developers plan to implement these in the near future. The study describes the implementation of the counter testing feature, and its results give guidelines for its further development.

Relevance: 40.00%

Abstract:

Four studies investigated the reliability and validity of thin slices of nonverbal behavior from social interactions, including: (1) how well individual slices of a given behavior predict other slices in the same interaction; (2) how well a slice of a given behavior represents the entirety of that behavior within an interaction; (3) how long a slice is necessary to sufficiently represent the entirety of a behavior within an interaction; (4) which slices best capture the entirety of behavior, across different behaviors; and (5) which behaviors (of six measured behaviors) are best captured by slices. Notable findings included strong reliability and validity for thin slices of gaze and nods, and that a 1.5-minute slice from the start of an interaction may adequately represent some behaviors. The results provide useful information to researchers making decisions about slice measurement of behavior.
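The validity questions above reduce to correlating a behavior measured in a short slice with the same behavior measured over the whole interaction. A minimal sketch under our own assumptions (invented function names and toy per-minute behavior counts, not the studies' data or procedure):

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def slice_validity(behavior, slice_len):
    """Correlate a thin slice (the first `slice_len` time bins of each
    interaction) with the behavior total over the whole interaction.

    `behavior` is a list of interactions, each a list of per-bin counts
    (e.g. nods per minute). Higher correlation = more representative slice.
    """
    slices = [sum(b[:slice_len]) for b in behavior]
    wholes = [sum(b) for b in behavior]
    return pearson(slices, wholes)
```

Varying `slice_len` and re-running this correlation is one way to explore, on one's own data, how long a slice must be before it adequately represents the whole interaction.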