841 results for Reliability and Validity


Relevance:

100.00%

Publisher:

Abstract:

The main goal of this proposal is to bring together the owners of the most advanced CPV technology, with respect to the state of the art, in order to research new applications for CPV systems from that leading position. In addition to opening up new markets, it will unveil possible sources of failure in new environments outside Europe, in order to assure component reliability. The proposed project will also try to improve the current technology of the industrial partners (ISOFOTON and CONCENTRIX) by accelerating the learning curve that CPV must follow in order to reach the competitive market, lowering costs significantly below those of current flat-panel PV within 3-4 years. The use of CPV systems in remote areas, together with harsher radiation, ambient and infrastructure conditions, will help to increase the rate of progress of this technology. In addition, ISFOC's contribution, which brings together seven power plants from seven CPV technologies totalling up to 3 MWpeak, will allow the most complete database of component and system performance to be generated, as well as of the effects of radiation and meteorology on system operation. Finally, regarding new applications for CPV, the project will, for the first time, use a 25 kWp CPV system in a stand-alone station in Egypt (NWRC) for water pumping and irrigation purposes. Similarly, ISOFOTON will connect up to 25 kWp of CPV to the Moroccan ONE utility grid. Regarding the research content of this project, which falls directly within the scope of the call, cooperative research between UPM, FhG-ISE and the two companies will be favoured by the fact that all are progressing in similar directions: developing two-stage-optics CPV systems. In addition to these technology improvements, UPM is very interested in developing a new module concept, recently patented, which will fulfil all the required characteristics of a good CPV module with fewer components and at reduced cost.

Relevance:

100.00%

Publisher:

Abstract:

There is growing concern nowadays about quality control, which has led to a search for methods capable of addressing reliability analysis effectively as part of statistics. Managers, researchers and engineers must understand that 'statistical thinking' is not just a set of statistical tools. They should approach 'statistical thinking' from a 'systems' perspective, that is, by developing systems that combine specific statistical tools and other methodologies for a given activity. The aim of this article is to encourage engineers, researchers and managers to develop this new way of thinking.

Relevance:

100.00%

Publisher:

Abstract:

The technical improvement and new applications of Infrared Thermography (IRT) with healthy subjects should be accompanied by results on the reproducibility of IRT measurements in different population groups. In addition, there is a clear need for a larger supply of software to analyse IRT images of human beings. Therefore, the objectives of this study were: first, to investigate the reproducibility of skin temperature (Tsk) in overweight and obese subjects using IRT across different Regions of Interest (ROI), moments and side-to-side differences (ΔT); and second, to check the reliability of a new software package called Termotracker®, specialized in the analysis of IRT images of human beings.
Methods: 22 overweight and obese males (11) and females (11) (age: 41.51±7.76 years; height: 1.65±0.09 m; weight: 82.41±11.81 kg; BMI: 30.17±2.58 kg/m²) were assessed in two consecutive thermograms (5 seconds in between) by the same observer, using an infrared camera (FLIR T335, Sweden) to obtain 4 IRT images of the whole body. 11 ROIs were selected using Termotracker® to analyse their reproducibility and reliability through Intra-class Correlation Coefficient (ICC) and Coefficient of Variation (CV) values.
Results: The reproducibility of the side-to-side differences (ΔT) between two consecutive thermograms was very high in all ROIs (mean ICC = 0.989), and excellent between two computers (mean ICC = 0.998). The reliability of the software was very high in all ROIs (mean ICC = 0.999). Intra-examiner reliability when analysing the same subjects in two consecutive thermograms was also very high (mean ICC = 0.997). CV values of the different ROIs were around 2%.
Conclusions: Skin temperature in overweight subjects showed excellent reproducibility across consecutive thermograms. The reproducibility of thermal asymmetries (ΔT) was also good, but it was influenced by several factors that should be investigated further. Termotracker® achieved excellent reliability results and is a reliable and objective software tool for analysing IRT images of human beings.
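The ICC and CV reliability measures used in the study above can be sketched as follows. This is a minimal illustration with made-up skin-temperature readings, not the study's data or the Termotracker® implementation, and it uses the simple one-way ICC(1,1) form rather than whichever ICC model the authors applied.

```python
# Test-retest reliability sketch: one-way ICC(1,1) plus the
# coefficient of variation (CV). All numbers are illustrative.

def icc_1_1(measurements):
    """One-way ICC for n subjects, each measured k times."""
    n = len(measurements)
    k = len(measurements[0])
    grand = sum(sum(row) for row in measurements) / (n * k)
    subject_means = [sum(row) / k for row in measurements]
    # Between-subjects and within-subject mean squares
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    msw = sum((x - subject_means[i]) ** 2
              for i, row in enumerate(measurements)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def cv_percent(values):
    """Coefficient of variation in percent (sample SD / mean)."""
    mean = sum(values) / len(values)
    var = sum((x - mean) ** 2 for x in values) / (len(values) - 1)
    return 100 * var ** 0.5 / mean

# Two consecutive skin-temperature readings (degC) for five subjects
tsk = [[31.2, 31.3], [30.8, 30.9], [32.1, 32.0], [31.5, 31.6], [30.4, 30.5]]
print(round(icc_1_1(tsk), 3))                       # 0.987
print(round(cv_percent([row[0] for row in tsk]), 2))  # 2.09
```

With nearly identical repeated readings the ICC approaches 1 while the CV stays small, mirroring the pattern the abstract reports (ICC ≈ 0.99, CV ≈ 2%).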

Relevance:

100.00%

Publisher:

Abstract:

Due to the enormous growth of digital data in recent years, a new parallel computing paradigm has arisen for processing large data volumes efficiently. Many of the systems based on this paradigm, called data-intensive computing systems, follow Google's MapReduce programming model. The main advantage of MapReduce systems is the idea of sending the computation to where the data resides, in an attempt to provide scalability and efficiency. In failure-free scenarios these frameworks usually achieve good results; however, most of the scenarios in which they are deployed are characterized by the presence of failures, so these platforms incorporate fault-tolerance and reliability features as built-in capabilities. On the other hand, dependability improvements are known to imply additional resource costs. This is reasonable, and the providers offering such infrastructures are aware of it. Nevertheless, not all approaches offer the same trade-off between fault-tolerance capabilities (or, more generally, reliability capabilities) and cost. This thesis addresses the coexistence of reliability and resource efficiency in MapReduce-based systems through methodologies that introduce minimal cost while guaranteeing an appropriate level of reliability. To achieve this, it proposes: (i) the formalization of a failure-detector abstraction; (ii) an alternative solution to the single points of failure of these platforms; and, finally, (iii) a novel feedback-based resource-allocation system at the container level.
These generic contributions have been instantiated for the Hadoop YARN architecture, which is currently the reference platform in the data-intensive computing community. The thesis demonstrates how all of its contributions outperform Hadoop YARN in terms of both reliability and resource efficiency.
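The failure-detector abstraction mentioned in the thesis summary above can be illustrated with a minimal heartbeat-based sketch. The class, names and timeout policy here are hypothetical, not the thesis's actual formalization or Hadoop YARN's implementation.

```python
# Minimal heartbeat failure-detector sketch: a node is suspected once
# no heartbeat has arrived within a fixed timeout. Illustrative only.

class HeartbeatFailureDetector:
    def __init__(self, timeout):
        self.timeout = timeout          # seconds allowed between heartbeats
        self.last_seen = {}             # node id -> time of last heartbeat

    def heartbeat(self, node, now):
        """Record a heartbeat from `node` at time `now`."""
        self.last_seen[node] = now

    def suspected(self, now):
        """Nodes whose last heartbeat is older than the timeout."""
        return {n for n, t in self.last_seen.items()
                if now - t > self.timeout}

fd = HeartbeatFailureDetector(timeout=3.0)
fd.heartbeat("worker-1", now=0.0)
fd.heartbeat("worker-2", now=0.0)
fd.heartbeat("worker-1", now=4.0)   # worker-2 stays silent
print(fd.suspected(now=5.0))        # {'worker-2'}
```

Real frameworks refine this basic scheme (adaptive timeouts, suspicion levels) precisely to tune the reliability-versus-cost trade-off the abstract discusses: a shorter timeout detects failures faster but wastes resources on false suspicions.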

Relevance:

100.00%

Publisher:

Abstract:

DNA sequence analysis dictates a new interpretation of phylogenetic trees. Taxa that were once thought to represent successive grades of complexity at the base of the metazoan tree are being displaced to much higher positions inside the tree. This leaves no evolutionary “intermediates” and forces us to rethink the genesis of bilaterian complexity.

Relevance:

100.00%

Publisher:

Abstract:

A new interview procedure is proposed for collecting valid information on the acquisition of high-level performance in sport. The procedure elicits verifiable information on the development of athletes' achievements in their primary sport, as well as factors that might influence performance, including involvement in other sporting activities, injuries, physical growth and the quality of training resources. Interviewed athletes also describe their engagement in specific training and other relevant activities during each year of their development, as well as how they experienced each type of activity. The collected information is then examined to identify those aspects of the athletes' recall of their development that meet criteria of reliability and validity. Recommendations are discussed for how coaches and scientists can use retrospective interviews to uncover aspects of development that distinguish elite from less accomplished athletes.

Relevance:

100.00%

Publisher:

Abstract:

Final dissertation for the Integrated Master's degree in Medicine, Faculdade de Medicina, Universidade de Lisboa, 2014

Relevance:

100.00%

Publisher:

Abstract:

"Performing organization: Oklahoma State University, College of Business Administration, Stillwater."

Relevance:

100.00%

Publisher:

Abstract:

Bibliography: p. 25.

Relevance:

100.00%

Publisher:

Abstract:

Bibliography: leaves 41-43.

Relevance:

100.00%

Publisher:

Abstract:

The reliability and validity of parent and teacher reports of behavioral inhibition (BI) were examined among children aged 3 to 5 years. Confirmatory factor analysis supported 6 correlated factors reflecting specific BI contexts, each loading on a single, higher-order factor of BI. Internal consistency was acceptable, with moderate stability over 1 year and a strong correlation with a brief inhibition subscale from a temperament questionnaire. Children who were rated by mothers and teachers as high in BI took longer to initiate contact with a stranger, spoke less often and for shorter periods, and required more prompting to elicit speech compared with low-BI peers in a simulated stranger-interaction task. Father report of BI was significantly associated with mean duration of speech and eye gaze.
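Internal consistency of the kind reported above is typically quantified with Cronbach's alpha. The sketch below applies the standard formula to made-up questionnaire scores; the items and values are illustrative, not the study's BI data.

```python
# Cronbach's alpha sketch: alpha = k/(k-1) * (1 - sum(item variances)
# / variance of total scores). Scores below are invented.

def cronbach_alpha(items):
    """items: one list of scores per item, same respondents in order."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

items = [
    [3, 4, 5, 2, 4],   # item 1 scores for five respondents
    [3, 5, 5, 2, 3],   # item 2
    [4, 4, 5, 1, 4],   # item 3
]
print(round(cronbach_alpha(items), 2))   # 0.93
```

Items that rise and fall together across respondents, as here, push alpha toward 1; uncorrelated items push it toward 0.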

Relevance:

100.00%

Publisher:

Abstract:

Awareness of optimal behaviour states of children with profound intellectual disability has been reported in the literature as a potentially useful tool for planning intervention within this population. Some arguments have been raised, however, which question the reliability and validity of previously published work on behaviour state analysis. This article sheds light on the debate by presenting two stages of a study of behaviour state analysis for eight girls with Rett syndrome. The results support Mudford, Hogg, and Roberts' (1997, 1999) concerns with the pooling of participant data. The results of Stage 2 also suggest, however, that most categories of behaviour state can be reliably distinguished once definitions of behaviours for each state are clearly defined.
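Inter-observer agreement on categorical behaviour-state codes, of the kind at issue in the debate above, is commonly quantified with Cohen's kappa. The sketch below uses invented state codes and ratings, not data from the Rett syndrome study.

```python
# Cohen's kappa sketch: chance-corrected agreement between two
# observers coding the same sequence of behaviour states.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal proportions
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

# Invented behaviour-state codes from two observers
obs_a = ["awake", "awake", "drowsy", "asleep", "awake", "drowsy"]
obs_b = ["awake", "drowsy", "drowsy", "asleep", "awake", "drowsy"]
print(round(cohens_kappa(obs_a, obs_b), 3))   # 0.739
```

Clear state definitions raise observed agreement and hence kappa, which is consistent with the article's finding that most behaviour-state categories can be reliably distinguished once the defining behaviours are made explicit.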

Relevance:

100.00%

Publisher:

Abstract:

The H I Parkes All Sky Survey (HIPASS) is a blind extragalactic H I 21-cm emission-line survey covering the whole southern sky from declination −90° to +25°. The HIPASS catalogue (HICAT), containing 4315 H I-selected galaxies from the region south of declination +2°, is presented in Meyer et al. (Paper I). This paper describes in detail the completeness and reliability of HICAT, which are calculated from the recovery rate of synthetic sources and from follow-up observations, respectively. HICAT is found to be 99 per cent complete at a peak flux of 84 mJy and an integrated flux of 9.4 Jy km s⁻¹. The overall reliability is 95 per cent, but rises to 99 per cent for sources with peak fluxes >58 mJy or integrated fluxes >8.2 Jy km s⁻¹. Expressions are derived for the uncertainties on the most important HICAT parameters: peak flux, integrated flux, velocity width and recessional velocity. The errors on HICAT parameters are dominated by the noise in the HIPASS data, rather than by the parametrization procedure.
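Catalogue completeness and reliability of the kind reported for HICAT are estimated, respectively, as the recovery fraction of injected synthetic sources and the confirmed fraction of follow-up observations, binned in flux. The sketch below uses invented fluxes and outcomes, not HICAT data.

```python
# Completeness/reliability sketch: the fraction of "successes" per
# flux bin, where success means recovery (for injected synthetic
# sources) or confirmation (for followed-up detections).

def fraction_per_bin(bins, flux, success):
    """For each (lo, hi) flux bin, the fraction of successes."""
    out = []
    for lo, hi in bins:
        hits = [ok for f, ok in zip(flux, success) if lo <= f < hi]
        out.append(sum(hits) / len(hits) if hits else None)
    return out

bins = [(20, 40), (40, 60), (60, 80), (80, 100)]   # peak flux, mJy

# Completeness: injected synthetic sources and whether each was recovered
inj_flux  = [25, 30, 45, 50, 55, 65, 70, 85, 90, 95]
recovered = [0,  1,  1,  0,  1,  1,  1,  1,  1,  1]

# Reliability: catalogued detections and whether follow-up confirmed them
det_flux  = [25, 35, 45, 55, 65, 75, 85, 95]
confirmed = [0,  1,  1,  1,  1,  1,  1,  1]

print(fraction_per_bin(bins, inj_flux, recovered))
print(fraction_per_bin(bins, det_flux, confirmed))
```

Both fractions rise toward 1 with increasing flux, which is the qualitative behaviour behind statements such as "99 per cent complete at a peak flux of 84 mJy".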